Block-wise primal-dual algorithms for large-scale doubly penalized ANOVA modeling

Cited: 3
Authors
Fu, Penghui [1 ,2 ]
Tan, Zhiqiang [1 ]
Affiliations
[1] Rutgers State Univ, Dept Stat, Piscataway, NJ 08854 USA
[2] Chinese Univ Hong Kong Shenzhen, Sch Data Sci, Shenzhen 518172, Guangdong, Peoples R China
Keywords
ANOVA modeling; Nonparametric regression; Penalized estimation; Primal-dual algorithms; Stochastic algorithms; Stochastic gradient methods; Total variation; SPLITTING ALGORITHM; REGRESSION; OPTIMIZATION; SELECTION;
DOI
10.1016/j.csda.2024.107932
CLC Number
TP39 [Computer Applications];
Discipline Code
081203 ; 0835 ;
Abstract
For multivariate nonparametric regression, doubly penalized ANOVA modeling (DPAM) has recently been proposed, using hierarchical total variations (HTVs) and empirical norms as penalties on the component functions such as main effects and multi-way interactions in a functional ANOVA decomposition of the underlying regression function. The two penalties play complementary roles: the HTV penalty promotes sparsity in the selection of basis functions within each component function, whereas the empirical-norm penalty promotes sparsity in the selection of component functions. To facilitate large-scale training of DPAM using backfitting or block minimization, two suitable primal-dual algorithms are developed, including both batch and stochastic versions, for updating each component function in single-block optimization. Existing applications of primal-dual algorithms are intractable for DPAM with both HTV and empirical-norm penalties. The validity and advantage of the stochastic primal-dual algorithms are demonstrated through extensive numerical experiments, compared with their batch versions and a previous active-set algorithm, in large-scale scenarios.
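The single-block updates described in the abstract pair a nonsmooth penalty with a linear operator, the setting handled by first-order primal-dual (Chambolle-Pock-style) iterations. As a minimal sketch, not the paper's DPAM algorithm, the snippet below applies such an iteration to a toy 1D total-variation problem, min_x 0.5‖x − b‖² + λ‖Dx‖₁ with D the first-difference operator; the function name and step sizes are illustrative choices.

```python
import numpy as np

def primal_dual_tv(b, lam=1.0, n_iter=2000):
    """Chambolle-Pock-style primal-dual iteration for the toy problem
    min_x 0.5*||x - b||^2 + lam*||D x||_1, where D is the 1D
    first-difference operator (a total-variation penalty)."""
    n = len(b)
    # First-difference matrix D: (Dx)_i = x_{i+1} - x_i
    D = np.diff(np.eye(n), axis=0)
    # Step sizes must satisfy tau*sigma*||D||^2 < 1; here ||D||^2 <= 4.
    tau = sigma = 0.45
    x = b.copy()          # primal variable
    x_bar = x.copy()      # extrapolated primal variable
    y = np.zeros(n - 1)   # dual variable
    for _ in range(n_iter):
        # Dual step + prox of (lam*||.||_1)^*: projection onto [-lam, lam]
        y = np.clip(y + sigma * (D @ x_bar), -lam, lam)
        # Primal step + prox of 0.5*||. - b||^2, which is closed-form
        x_new = (x - tau * (D.T @ y) + tau * b) / (1.0 + tau)
        # Extrapolation (over-relaxation) step
        x_bar = 2.0 * x_new - x
        x = x_new
    return x

if __name__ == "__main__":
    b = np.array([0., 0., 0., 5., 5., 5.])
    # The fused-lasso solution shrinks the jump: each plateau moves
    # toward the other by lam/3, i.e. approximately [1/3,...,14/3,...].
    print(np.round(primal_dual_tv(b, lam=1.0), 3))
```

The paper's contribution lies in extending this type of scheme, in batch and stochastic form, to the combined HTV and empirical-norm penalties of DPAM, for which off-the-shelf primal-dual applications are intractable.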
Pages: 19