Efficient and Accelerated Online Learning for Sparse Group Lasso

Cited by: 3
Authors
Li Zhi-Jie [1 ]
Li Yuan-Xiang [1 ]
Wang Feng [1 ]
Yu Fei [1 ]
Xiang Zheng-Long [1 ]
Affiliations
[1] Wuhan Univ, State Key Lab Software Engn, Wuhan 430072, Peoples R China
Source
2014 IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOP (ICDMW) | 2014
Keywords
group lasso; sparsity; online learning; dual averaging method; accelerated convergence; SELECTION; REGRESSION;
DOI
10.1109/ICDMW.2014.94
CLC number
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
Batch-mode group lasso algorithms suffer from inefficiency and poor scalability, whereas online learning algorithms for group lasso are a promising tool for attacking large-scale problems. However, the low per-iteration time complexity of current online algorithms is often accompanied by a slow convergence rate, and a faster convergence rate is key to guaranteeing the performance of online learning. We develop a novel accelerated online learning algorithm for the sparse group lasso model, which achieves sparsity at both the group level and the individual feature level. By adopting the dual averaging method, its worst-case time complexity and memory cost at each iteration are both O(d), where d is the number of dimensions. Moreover, our online algorithm has an acceleration capability: its theoretical convergence rate is O(1/T^2) after T steps. Experimental results on synthetic and real-world datasets demonstrate the merits of the proposed online algorithm for sparse group lasso.
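The abstract's claims can be made concrete with a minimal sketch of a dual-averaging (RDA-style) update under a sparse group lasso regularizer. This is an illustrative reconstruction, not the paper's exact accelerated algorithm: the function name `rda_sgl_update`, the step-size parameter `gamma`, and the closed-form two-stage shrinkage (feature-level soft-thresholding with `lam1`, then group-level shrinkage with `lam2`) are assumptions based on standard dual-averaging formulations. It does show why the per-iteration cost is O(d) and how sparsity arises at both levels:

```python
import numpy as np

def rda_sgl_update(g_bar, groups, t, lam1, lam2, gamma=1.0):
    """One dual-averaging update for a sparse group lasso regularizer.

    g_bar  : running average of subgradients, shape (d,)
    groups : list of index arrays partitioning the d coordinates
    t      : current iteration count (t >= 1)
    lam1   : l1 weight (feature-level sparsity)
    lam2   : group-norm weight (group-level sparsity)
    gamma  : step-size constant (assumed parameterization)

    Each coordinate is touched a constant number of times, so the
    cost per call is O(d), matching the complexity quoted above.
    """
    w = np.zeros_like(g_bar)
    scale = np.sqrt(t) / gamma
    for idx in groups:
        # Feature-level soft-thresholding with lam1 zeroes
        # individual coordinates whose average gradient is small.
        u = np.sign(g_bar[idx]) * np.maximum(np.abs(g_bar[idx]) - lam1, 0.0)
        norm = np.linalg.norm(u)
        # Group-level shrinkage: the whole group stays zero unless
        # its thresholded norm exceeds lam2.
        if norm > lam2:
            w[idx] = -scale * (1.0 - lam2 / norm) * u
    return w
```

For example, a group whose averaged gradients all fall below `lam1` is eliminated entirely, while a group with large averaged gradients keeps (shrunken) nonzero weights.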
Pages: 1171 - 1177 (7 pages)
Related papers (50 in total)
  • [31] Bilevel Learning of the Group Lasso Structure
    Frecon, Jordan
    Salzo, Saverio
    Pontil, Massimiliano
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [32] Sparse kernel learning with LASSO and Bayesian inference algorithm
    Gao, Junbin
    Kwan, Paul W.
    Shi, Daming
    NEURAL NETWORKS, 2010, 23 (02) : 257 - 264
  • [33] Metafeature Selection via Multivariate Sparse-Group Lasso Learning for Automatic Hyperparameter Configuration Recommendation
    Deng, Liping
    Chen, Wen-Sheng
    Xiao, Mingqing
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (09) : 12540 - 12552
  • [34] Regularized group regression methods for genomic prediction: Bridge, MCP, SCAD, group bridge, group lasso, sparse group lasso, group MCP and group SCAD
    Ogutu, Joseph O.
    Piepho, Hans-Peter
    BMC PROCEEDINGS, 8 (Suppl 5)
  • [35] Multi-Task Learning for Compositional Data via Sparse Network Lasso
    Okazaki, Akira
    Kawano, Shuichi
    ENTROPY, 2022, 24 (12)
  • [36] An Online Learning Algorithm with Dual Sparse Mechanisms
    Wei B.
    Wu R.-F.
    Zhang W.-S.
    Lü J.-Q.
    Wang Y.-Y.
    Xia X.-W.
    Tien Tzu Hsueh Pao/Acta Electronica Sinica, 2019, 47 (10): : 2202 - 2210
  • [37] Online Learning for Matrix Factorization and Sparse Coding
    Mairal, Julien
    Bach, Francis
    Ponce, Jean
    Sapiro, Guillermo
    JOURNAL OF MACHINE LEARNING RESEARCH, 2010, 11 : 19 - 60
  • [38] Cheese brand identification with Raman spectroscopy and sparse group LASSO
    Zhang, Yinsheng
    Qin, Beibei
    Zhang, Mengrui
    Zhang, Zhengyong
    Wang, Haiyan
    JOURNAL OF FOOD COMPOSITION AND ANALYSIS, 2025, 141
  • [39] Sparse Online Learning via Truncated Gradient
    Langford, John
    Li, Lihong
    Zhang, Tong
    JOURNAL OF MACHINE LEARNING RESEARCH, 2009, 10 : 777 - 801
  • [40] Genetic Variants Detection Based on Weighted Sparse Group Lasso
    Che, Kai
    Chen, Xi
    Guo, Maozu
    Wang, Chunyu
    Liu, Xiaoyan
    FRONTIERS IN GENETICS, 2020, 11