A Stochastic Majorize-Minimize Subspace Algorithm for Online Penalized Least Squares Estimation

Cited by: 31
Authors
Chouzenoux, Emilie [1 ,2 ]
Pesquet, Jean-Christophe [3 ]
Affiliations
[1] INRIA Saclay, Ctr Visual Comp, F-92295 Chatenay-Malabry, France
[2] Univ Paris-Est, CNRS, LIGM, UMR 8049, F-77454 Marne-la-Vallee, France
[3] Univ Paris-Saclay, CentraleSupelec, Ctr Visual Comp, F-92295 Chatenay-Malabry, France
Keywords
Stochastic approximation; optimization; subspace algorithms; memory gradient methods; descent methods; recursive algorithms; majorization-minimization; filter identification; Newton method; sparsity; machine learning; adaptive filtering; convergence; regularization; signal; RLS
DOI
10.1109/TSP.2017.2709265
Chinese Library Classification
TM (Electrical technology); TN (Electronic technology, communication technology)
Discipline codes
0808; 0809
Abstract
Stochastic approximation techniques play an important role in solving many problems encountered in machine learning and adaptive signal processing. In these contexts, the statistics of the data are often unknown a priori, or their direct computation is too intensive, so they have to be estimated online from the observed signals. For the batch optimization of an objective function defined as the sum of a data fidelity term and a penalization (e.g., a sparsity-promoting function), Majorize-Minimize (MM) methods have recently attracted much interest because they are fast, highly flexible, and effective in ensuring convergence. The goal of this paper is to show how these methods can be successfully extended to the case where the data fidelity term corresponds to a least squares criterion and the cost function is replaced by a sequence of stochastic approximations of it. In this context, we propose an online version of an MM subspace algorithm and study its convergence using suitable probabilistic tools. Simulation results illustrate the good practical performance of the proposed algorithm, combined with a memory gradient subspace, when applied to both nonadaptive and adaptive filter identification problems.
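To make the algorithmic idea concrete, below is a minimal Python sketch of a stochastic MM iteration restricted to a two-dimensional memory gradient subspace, applied to online penalized least squares. It is a sketch under assumptions, not the paper's implementation: the hyperbolic penalty, the recursive estimation of the second-order statistics, and the function names (online_mm_memory_gradient, phi_curvature, synthetic_stream) are all illustrative choices.

import numpy as np

def phi_curvature(u, delta=1e-2):
    # Majorant curvature phi'(u)/u for the hyperbolic penalty
    # phi(u) = sqrt(u^2 + delta^2) - delta (half-quadratic majorization).
    return 1.0 / np.sqrt(u ** 2 + delta ** 2)

def online_mm_memory_gradient(stream, dim, lam=0.1, max_iter=500):
    # Hypothetical sketch: stream yields (a_t, y_t) pairs; the quadratic
    # statistics (R, r) of the least squares term are estimated recursively.
    x = np.zeros(dim)
    d_prev = np.zeros(dim)          # previous step (the "memory" direction)
    R = np.zeros((dim, dim))        # running estimate of E[a a^T]
    r = np.zeros(dim)               # running estimate of E[y a]
    for t, (a, y) in enumerate(stream, start=1):
        R += (np.outer(a, a) - R) / t
        r += (y * a - r) / t
        # Gradient of the stochastic approximation of the penalized cost.
        grad = R @ x - r + lam * x * phi_curvature(x)
        # Memory gradient subspace spanned by [-grad, d_prev].
        D = -grad.reshape(-1, 1) if t == 1 else np.column_stack((-grad, d_prev))
        # Curvature matrix of the quadratic majorant at x.
        A = R + lam * np.diag(phi_curvature(x))
        # Minimize the majorant within the subspace: a 2x2 (or 1x1) solve.
        u, *_ = np.linalg.lstsq(D.T @ A @ D, -D.T @ grad, rcond=None)
        d_prev = D @ u
        x = x + d_prev
        if t >= max_iter:
            break
    return x

def synthetic_stream(h, rng, sigma=0.01):
    # Assumed toy data model for sparse filter identification.
    while True:
        a = rng.standard_normal(h.size)
        yield a, a @ h + sigma * rng.standard_normal()

rng = np.random.default_rng(0)
h = np.zeros(50)
h[[3, 17, 41]] = [1.0, -0.7, 0.4]   # sparse ground-truth filter
x_hat = online_mm_memory_gradient(synthetic_stream(h, rng), dim=50,
                                  lam=0.05, max_iter=2000)

Because the subspace contains only the current gradient and the previous step, minimizing the quadratic majorant reduces to a 2x2 linear solve per sample, which is what makes the memory gradient variant attractive at this scale.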
Pages: 4770-4783
Page count: 14