Recursive Least Squares Dictionary Learning Algorithm

Cited by: 298
Authors
Skretting, Karl [1]
Engan, Kjersti [1]
Affiliations
[1] Univ Stavanger, Dept Elect & Comp Engn, IDE, N-4036 Stavanger, Norway
Keywords
Dictionary learning; dictionary design; sparse approximation
DOI
10.1109/TSP.2010.2040671
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline codes
0808; 0809
Abstract
We present the recursive least squares dictionary learning algorithm (RLS-DLA), which can be used for learning overcomplete dictionaries for sparse signal representation. Most DLAs presented earlier, for example ILS-DLA and K-SVD, update the dictionary after a batch of training vectors has been processed, usually using the whole set of training vectors as one batch. The training set is used iteratively to gradually improve the dictionary. The approach in RLS-DLA is a continuous update of the dictionary as each training vector is processed. The core of the algorithm is compact and can be implemented efficiently. The algorithm is derived along much the same path as the recursive least squares (RLS) algorithm for adaptive filtering. Thus, as in RLS, a forgetting factor lambda can be introduced and easily implemented in the algorithm. Adjusting lambda appropriately makes the algorithm less dependent on the initial dictionary and improves both the convergence properties of RLS-DLA and the representation ability of the resulting dictionary. Two sets of experiments are done to test different methods for learning dictionaries. The goal of the first set is to explore some basic properties of the algorithm in a simple setup; the goal of the second is the reconstruction of a true underlying dictionary. The first set confirms the properties conjectured in the derivation, while the second demonstrates excellent performance.
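To make the per-vector update described above concrete, the following is a minimal sketch of an RLS-DLA-style learning pass, not the authors' reference implementation. It assumes plain orthogonal matching pursuit (OMP) for the sparse-approximation step, a simple linear ramp of the forgetting factor lambda toward 1, and per-step renormalization of the atoms; the function names omp and rls_dla and all parameter choices are illustrative.

import numpy as np

def omp(D, x, s):
    """Greedy orthogonal matching pursuit: pick s atoms of D to approximate x.
    A simple stand-in for the sparse-approximation (vector selection) step;
    any other method (ORMP, lasso, ...) could be used instead."""
    residual = x.copy()
    support = []
    for _ in range(s):
        # choose the atom most correlated with the current residual
        k = int(np.argmax(np.abs(D.T @ residual)))
        if k not in support:
            support.append(k)
        # least-squares fit on the selected atoms, then update the residual
        w_sub, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ w_sub
    w = np.zeros(D.shape[1])
    w[support] = w_sub
    return w

def rls_dla(X, num_atoms, s, lam0=0.99, seed=0):
    """Sketch of an RLS-DLA-style pass over the columns of X, one vector at a time.

    D    : dictionary (N x K), columns kept at unit norm.
    C    : K x K matrix playing the role of the inverse data matrix in RLS,
           updated by a Sherman-Morrison rank-one correction.
    lam0 : initial forgetting factor; here it is ramped linearly toward 1 as a
           stand-in for the lambda schedules discussed in the paper.
    """
    rng = np.random.default_rng(seed)
    N, L = X.shape
    D = rng.standard_normal((N, num_atoms))
    D /= np.linalg.norm(D, axis=0)          # normalize the initial atoms
    C = np.eye(num_atoms)

    for t in range(L):
        lam = lam0 + (1.0 - lam0) * t / L   # forgetting factor approaching 1
        x = X[:, t]
        w = omp(D, x, s)                    # sparse coefficients with current D
        r = x - D @ w                       # representation error
        C = C / lam                         # apply the forgetting factor
        u = C @ w
        alpha = 1.0 / (1.0 + w @ u)         # Sherman-Morrison scalar
        D = D + alpha * np.outer(r, u)      # rank-one dictionary update
        C = C - alpha * np.outer(u, u)      # rank-one update of C
        # practical simplification: renormalize atoms without rescaling C
        D /= np.linalg.norm(D, axis=0)
    return D

# Example use (illustrative): D = rls_dla(X, num_atoms=64, s=4)

Both the dictionary and the C matrix change by only a rank-one term per training vector, which is what keeps the core of the algorithm compact, much as the gain-vector update keeps standard RLS cheap.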
Pages: 2121-2130
Number of pages: 10
Related Papers
34 records in total
[1] Aharon, Michal; Elad, Michael; Bruckstein, Alfred M. On the uniqueness of overcomplete dictionaries, and a practical way to retrieve them. Linear Algebra and Its Applications, 2006, 416(1): 48-67.
[2] Aharon, Michal; Elad, Michael; Bruckstein, Alfred. K-SVD: An algorithm for designing overcomplete dictionaries for sparse representation. IEEE Transactions on Signal Processing, 2006, 54(11): 4311-4322.
[3] [Anonymous]. Thesis, New York U NE, 1994.
[4] Bottou, L. Advances in Neural Information Processing Systems 7, 1995: 585.
[5] Cotter, SF. Conf Rec Asilomar C, 2002: 963.
[6] Cotter, SF; Adler, J; Rao, BD; Kreutz-Delgado, K. Forward sequential algorithms for best basis selection. IEE Proceedings - Vision, Image and Signal Processing, 1999, 146(5): 235-244.
[7] Cox, DR. Note on grouping. Journal of the American Statistical Association, 1957, 52(280): 543-547.
[8] Darken, C. Adv Neural Inf Process Syst, 1990, 3: 832.
[9] Daubechies, I. Soc Ind Appl Math, 1992, 61: 53. DOI 10.1137/1.9781611970104.
[10] Engan, K. Int Conf Acoust Spee, 1999: 2443. DOI 10.1109/ICASSP.1999.760624.