Efficient and Accelerated Online Learning for Sparse Group Lasso

Cited by: 3
Authors
Li Zhi-Jie [1 ]
Li Yuan-Xiang [1 ]
Wang Feng [1 ]
Yu Fei [1 ]
Xiang Zheng-Long [1 ]
Affiliations
[1] Wuhan Univ, State Key Lab Software Engn, Wuhan 430072, Peoples R China
Source
2014 IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOP (ICDMW) | 2014
Keywords
group lasso; sparsity; online learning; dual averaging method; accelerated convergence; SELECTION; REGRESSION
DOI
10.1109/ICDMW.2014.94
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology]
Discipline Code
0812
Abstract
Batch-mode group lasso algorithms suffer from inefficiency and poor scalability, and online learning algorithms for group lasso are a promising tool for attacking large-scale problems. However, the low per-iteration time complexity of current online algorithms is often accompanied by a slow convergence rate, and a faster convergence rate is key to guaranteeing the performance of online learning algorithms. We develop a novel accelerated online learning algorithm to solve the sparse group lasso model. The sparse group lasso model achieves sparsity at both the group level and the individual feature level. By adopting the dual averaging method, the worst-case time complexity and memory cost of each iteration are both O(d), where d is the number of dimensions. Moreover, our online algorithm has an accelerated convergence capability, with a theoretical convergence rate of O(1/T^2) after T iterations. Experimental results on synthetic and real-world datasets demonstrate the merits of the proposed online algorithm for sparse group lasso.
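To make the mechanics concrete, the sketch below shows a plain (non-accelerated) regularized dual averaging (RDA) update under the sparse group lasso penalty lam1*||w||_1 + lam2*sum_g ||w_g||_2. It follows the standard RDA closed-form solution for this penalty, not necessarily this paper's accelerated variant; the function names, the grad_fn interface, the step-size rule beta_t = gamma*sqrt(t), and the synthetic demo data are all illustrative assumptions. Each step separates over groups: coordinate-wise soft-thresholding of the averaged gradient gives feature-level sparsity, then a group-level shrinkage zeroes out whole groups, which is what makes every iteration O(d) in time and memory as the abstract claims.

```python
import numpy as np

def soft_threshold(x, tau):
    """Coordinate-wise soft-thresholding: sign(x) * max(|x| - tau, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def rda_sparse_group_lasso(grad_fn, d, groups, lam1, lam2, gamma, T):
    """Plain RDA sketch for the penalty lam1*||w||_1 + lam2*sum_g ||w_g||_2.

    grad_fn(w, t) returns a (sub)gradient of the loss at iterate w on step t;
    groups is a list of index arrays partitioning range(d). Every iteration
    touches each coordinate a constant number of times: O(d) time and memory.
    """
    w = np.zeros(d)
    g_bar = np.zeros(d)                       # running average of gradients
    for t in range(1, T + 1):
        g_bar += (grad_fn(w, t) - g_bar) / t  # incremental mean update
        alpha = gamma / np.sqrt(t)            # beta_t / t with beta_t = gamma*sqrt(t)
        for idx in groups:
            s = soft_threshold(g_bar[idx], lam1)   # feature-level sparsity
            norm_s = np.linalg.norm(s)
            if norm_s <= lam2:                     # whole group zeroed out
                w[idx] = 0.0
            else:                                  # group-level shrinkage
                w[idx] = -(1.0 / alpha) * (1.0 - lam2 / norm_s) * s
    return w

# Hypothetical streaming least-squares demo with synthetic data:
rng = np.random.default_rng(0)
X, y = rng.normal(size=(1000, 20)), rng.normal(size=1000)
groups = [np.arange(i, i + 5) for i in range(0, 20, 5)]
grad = lambda w, t: X[t % 1000] * (X[t % 1000] @ w - y[t % 1000])
w_hat = rda_sparse_group_lasso(grad, 20, groups, 0.1, 0.1, 1.0, 500)
```

The paper's accelerated algorithm would layer additional momentum-style machinery on top of this basic update to improve the rate from the generic RDA guarantee to the claimed O(1/T^2); that machinery is omitted from this sketch.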
Pages: 1171-1177
Number of pages: 7
Related Papers
50 items in total
  • [21] An algorithm for the multivariate group lasso with covariance estimation
    Wilms, I.
    Croux, C.
    JOURNAL OF APPLIED STATISTICS, 2018, 45 (04) : 668 - 681
  • [22] Seagull: lasso, group lasso and sparse-group lasso regularization for linear regression models via proximal gradient descent
    Klosa, Jan
    Simon, Noah
    Westermark, Pal Olof
    Liebscher, Volkmar
    Wittenburg, Doerte
    BMC BIOINFORMATICS, 2020, 21 (01)
  • [23] A Bayesian Lasso based sparse learning model
    Helgoy, Ingvild M.
    Li, Yushu
    COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION, 2023
  • [24] Sparse group LASSO based uncertain feature selection
    Xie, Zongxia
    Xu, Yong
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2014, 5 (02) : 201 - 210
  • [25] Efficient Methods for Overlapping Group Lasso
    Yuan, Lei
    Liu, Jun
    Ye, Jieping
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2013, 35 (09) : 2104 - 2116
  • [26] Sparse Group Lasso: Optimal Sample Complexity, Convergence Rate, and Statistical Inference
    Cai, T. Tony
    Zhang, Anru R.
    Zhou, Yuchen
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2022, 68 (09) : 5975 - 6002
  • [27] A fast unified algorithm for solving group-lasso penalize learning problems
    Yang, Yi
    Zou, Hui
    STATISTICS AND COMPUTING, 2015, 25 (06) : 1129 - 1141
  • [28] Autonomous Tracking and State Estimation With Generalized Group Lasso
    Gao, Rui
    Sarkka, Simo
    Claveria-Vega, Ruben
    Godsill, Simon
    IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52 (11) : 12056 - 12070
  • [29] Linearized alternating direction method of multipliers for sparse group and fused LASSO models
    Li, Xinxin
    Mo, Lili
    Yuan, Xiaoming
    Zhang, Jianzhong
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2014, 79 : 203 - 221
  • [30] FedGroup-Prune: IoT Device Amicable and Training-Efficient Federated Learning via Combined Group Lasso Sparse Model Pruning
    Chen, Ziyao
    Peng, Jialiang
    Kang, Jiawen
    Niyato, Dusit
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (24) : 40921 - 40932