Efficient and Accelerated Online Learning for Sparse Group Lasso

Cited: 3
Authors
Li Zhi-Jie [1 ]
Li Yuan-Xiang [1 ]
Wang Feng [1 ]
Yu Fei [1 ]
Xiang Zheng-Long [1 ]
Affiliations
[1] Wuhan Univ, State Key Lab Software Engn, Wuhan 430072, Peoples R China
Source
2014 IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOP (ICDMW) | 2014
Keywords
group lasso; sparsity; online learning; dual averaging method; accelerated convergence; SELECTION; REGRESSION;
DOI
10.1109/ICDMW.2014.94
CLC Classification
TP [Automation & Computer Technology];
Subject Classification Code
0812 ;
Abstract
Batch-mode group lasso algorithms suffer from inefficiency and poor scalability, so online learning for group lasso is a promising tool for attacking large-scale problems. However, the low per-iteration time complexity of current online algorithms is often accompanied by a slow convergence rate, and a faster convergence rate is key to guaranteeing the performance of online learning. We develop a novel accelerated online learning algorithm for the sparse group lasso model, which achieves sparsity at both the group level and the individual-feature level. By adopting the dual averaging method, the worst-case time complexity and memory cost at each iteration are both O(d), where d is the number of dimensions. Moreover, our online algorithm has an accelerated convergence capability, with a theoretical convergence rate of O(1/T²) up to the T-th step. Experimental results on synthetic and real-world datasets demonstrate the merits of the proposed online algorithm for sparse group lasso.
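The paper's accelerated variant is not reproduced here, but the abstract's core claim, an O(d) per-iteration dual averaging step for the sparse group lasso regularizer λ₁‖w‖₁ + λ₂ Σ_g ‖w_g‖₂, can be illustrated with a minimal NumPy sketch. The closed-form update below (elementwise soft-thresholding followed by group-level shrinkage) follows the standard RDA treatment; the step-size scheme, parameter values, and toy data stream are illustrative assumptions, not the authors' exact algorithm:

```python
import numpy as np

def sgl_rda_step(g_bar, t, lam1, lam2, gamma, groups):
    """One closed-form dual-averaging (RDA) step for sparse group lasso.

    Minimizes <g_bar, w> + lam1*||w||_1 + lam2*sum_g ||w_g||_2
              + (gamma/sqrt(t)) * ||w||^2 / 2.
    Cost is O(d): one pass for the l1 shrink, one pass over groups.
    """
    # Feature-level sparsity: elementwise soft-threshold by lam1.
    v = np.sign(-g_bar) * np.maximum(np.abs(g_bar) - lam1, 0.0)
    w = np.zeros_like(v)
    scale = np.sqrt(t) / gamma  # = t / (gamma * sqrt(t))
    for idx in groups:  # group-level sparsity: shrink each group's norm
        norm = np.linalg.norm(v[idx])
        if norm > lam2:
            w[idx] = scale * (1.0 - lam2 / norm) * v[idx]
    return w

# Toy online regression stream: only the first group carries signal.
rng = np.random.default_rng(0)
d = 8
groups = [np.arange(0, 4), np.arange(4, 8)]
w_true = np.zeros(d)
w_true[:2] = [1.5, -2.0]

w = np.zeros(d)
g_bar = np.zeros(d)  # running average of observed gradients
lam1, lam2, gamma = 0.02, 0.05, 1.0
for t in range(1, 501):
    x = rng.standard_normal(d)
    y = x @ w_true
    grad = (w @ x - y) * x        # squared-loss gradient at w_t
    g_bar += (grad - g_bar) / t   # update the gradient average
    w = sgl_rda_step(g_bar, t, lam1, lam2, gamma, groups)
```

After the stream, the weights in the signal-bearing group dominate, while the irrelevant group is driven toward zero by the group-level shrinkage, which is the two-level sparsity the abstract describes.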
Pages: 1171 - 1177
Number of pages: 7
Related Papers
50 in total
  • [41] Stabilized Sparse Online Learning for Sparse Data
    Ma, Yuting
    Zheng, Tian
    JOURNAL OF MACHINE LEARNING RESEARCH, 2017, 18
  • [42] Sparse EEG/MEG source estimation via a group lasso
    Lim, Michael
    Ales, Justin M.
    Cottereau, Benoit R.
    Hastie, Trevor
    Norcia, Anthony M.
    PLOS ONE, 2017, 12 (06):
  • [43] Learning Interactions via Hierarchical Group-Lasso Regularization
    Lim, Michael
    Hastie, Trevor
    JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2015, 24 (03) : 627 - 654
  • [44] Enhancing Classification Algorithm Recommendation in Automated Machine Learning: A Meta-Learning Approach Using Multivariate Sparse Group Lasso
    Khan, Irfan
    Zhang, Xianchao
    Ayyasamy, Ramesh Kumar
    Alhashmi, Saadat M.
    Rahim, Azizur
    CMES-COMPUTER MODELING IN ENGINEERING & SCIENCES, 2025, 142 (02): : 1611 - 1636
  • [45] Online learning with sparse labels
    He, Wenwu
    Zou, Fumin
    Liang, Quan
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2019, 31 (23)
  • [47] Multiple Change-Points Estimation in Linear Regression Models via Sparse Group Lasso
    Zhang, Bingwen
    Geng, Jun
    Lai, Lifeng
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2015, 63 (09) : 2209 - 2224
  • [48] Cost-sensitive sparse group online learning for imbalanced data streams
    Chen, Zhong
    Sheng, Victor
    Edwards, Andrea
    Zhang, Kun
    MACHINE LEARNING, 2024, 113 (07) : 4407 - 4444
  • [49] Consistency of the group Lasso and multiple kernel learning
    Bach, Francis R.
    JOURNAL OF MACHINE LEARNING RESEARCH, 2008, 9 : 1179 - 1225
  • [50] Multivariate sparse group lasso for the multivariate multiple linear regression with an arbitrary group structure
    Li, Yanming
    Nan, Bin
    Zhu, Ji
    BIOMETRICS, 2015, 71 (02) : 354 - 363