Sparse Volterra and Polynomial Regression Models: Recoverability and Estimation

Cited by: 62
Authors
Kekatos, Vassilis [1 ,2 ]
Giannakis, Georgios B. [1 ,2 ]
Affiliations
[1] Univ Minnesota, Dept Elect & Comp Engn, Minneapolis, MN 55455 USA
[2] Univ Minnesota, Digital Technol Ctr, Minneapolis, MN 55455 USA
Funding
US National Science Foundation;
Keywords
Compressive sampling; Lasso; polynomial kernels; restricted isometry properties; Volterra filters; SELECTION; LASSO; WIENER; GENOME;
DOI
10.1109/TSP.2011.2165952
Chinese Library Classification
TM [Electrical engineering]; TN [Electronic technology, communication technology];
Discipline codes
0808; 0809;
Abstract
Volterra and polynomial regression models play a major role in nonlinear system identification and inference tasks. Exciting applications ranging from neuroscience to genome-wide association analysis build on these models with the additional requirement of parsimony. This requirement has high interpretative value, but unfortunately cannot be met by least-squares-based or kernel regression methods. To this end, compressive sampling (CS) approaches, already successful in linear regression settings, offer a viable alternative. The viability of CS for sparse Volterra and polynomial models is the core theme of this work. A common sparse regression task is initially posed for the two models. Building on (weighted) Lasso-based schemes, an adaptive RLS-type algorithm is developed for sparse polynomial regressions. The identifiability of polynomial models is critically challenged by dimensionality; however, following the CS principle, when these models are sparse, they can be recovered from far fewer measurements. To quantify the number of measurements sufficient for a given level of sparsity, restricted isometry properties (RIP) are investigated in commonly met polynomial regression settings, generalizing known results for their linear counterparts. The merits of the novel (weighted) adaptive CS algorithms for sparse polynomial modeling are verified through synthetic as well as real data tests for genotype-phenotype analysis.
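The sparse regression task the abstract describes — a truncated Volterra expansion whose few nonzero kernel coefficients are recovered via an l1-penalized (Lasso) fit — can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the second-order Volterra expansion, the ISTA solver, and all parameter values (memory length, regularization weight, detection threshold) are assumptions chosen for the demo.

```python
import numpy as np

# Minimal sketch (not the paper's algorithm): recover a sparse second-order
# Volterra kernel with the Lasso, solved here by iterative soft-thresholding
# (ISTA). Memory length, regularization weight, and thresholds are assumed
# demo values.
rng = np.random.default_rng(0)

N, M = 200, 10                      # output samples, input memory length
x = rng.standard_normal(N + M)      # input sequence

# Regressor matrix: linear terms x[n-i] plus quadratic terms x[n-i]*x[n-j], i<=j
rows = []
for n in range(M, N + M):
    lin = x[n - M + 1:n + 1][::-1]  # x[n], x[n-1], ..., x[n-M+1]
    quad = [lin[i] * lin[j] for i in range(M) for j in range(i, M)]
    rows.append(np.concatenate([lin, quad]))
Phi = np.array(rows)                # N x P with P = M + M*(M+1)/2 = 65

# Sparse ground-truth kernel: only 4 of the 65 coefficients are nonzero
P = Phi.shape[1]
h = np.zeros(P)
h[[0, 3, M + 2, M + 7]] = [1.0, -0.8, 0.5, 0.3]
y = Phi @ h + 0.01 * rng.standard_normal(N)

# ISTA for min_h 0.5*||y - Phi h||^2 + lam*||h||_1
lam = 0.1
L = np.linalg.norm(Phi, 2) ** 2     # Lipschitz constant of the LS gradient
h_hat = np.zeros(P)
for _ in range(5000):
    z = h_hat - Phi.T @ (Phi @ h_hat - y) / L
    h_hat = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)

support = np.flatnonzero(np.abs(h_hat) > 0.05)
print(sorted(support.tolist()))     # indices of the recovered nonzero coefficients
```

Shrinking N below the number of unknowns puts the same script in the underdetermined regime that the paper's RIP analysis quantifies: with enough sparsity, the support can still be recovered from far fewer measurements than coefficients.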
Pages: 5907-5920
Number of pages: 14