Efficient and Accelerated Online Learning for Sparse Group Lasso

Cited by: 3
Authors
Li Zhi-Jie [1 ]
Li Yuan-Xiang [1 ]
Wang Feng [1 ]
Yu Fei [1 ]
Xiang Zheng-Long [1 ]
Affiliations
[1] Wuhan Univ, State Key Lab Software Engn, Wuhan 430072, Peoples R China
Source
2014 IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOP (ICDMW), 2014
Keywords
group lasso; sparsity; online learning; dual averaging method; accelerated convergence; selection; regression
DOI
10.1109/ICDMW.2014.94
Chinese Library Classification (CLC)
TP [Automation technology, computer technology]
Discipline classification code
0812
Abstract
Batch-mode group lasso algorithms suffer from inefficiency and poor scalability, so online learning algorithms for group lasso are a promising tool for attacking large-scale problems. However, the low per-iteration time complexity of current online algorithms is often accompanied by a slow convergence rate, and achieving a faster convergence rate is a key requirement for practical online learning. We develop a novel accelerated online learning algorithm for the sparse group lasso model, which achieves sparsity at both the group level and the individual feature level. By adopting the dual averaging method, the worst-case time complexity and memory cost at each iteration are both of order O(d), where d is the number of dimensions. Moreover, our online algorithm is accelerated, with a theoretical convergence rate of O(1/T^2) after T iterations. Experimental results on synthetic and real-world datasets demonstrate the merits of the proposed online algorithm for sparse group lasso.
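As a rough illustration of the O(d) per-iteration cost described in the abstract, the sketch below shows a dual-averaging-style update with the sparse group lasso regularizer, computed in closed form by an elementwise l1 prox followed by a group-wise l2 prox. This is an illustrative sketch only, not the authors' accelerated algorithm; the function names, the parameters lam1, lam2, gamma, and the group encoding are all assumptions made for the example.

import numpy as np

def soft_threshold(x, tau):
    # Elementwise soft-thresholding: proximal operator of tau * ||.||_1.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def sgl_dual_averaging_step(avg_grad, groups, lam1, lam2, gamma):
    # Illustrative sketch (not the paper's exact accelerated scheme).
    # Closed-form minimizer of
    #   <avg_grad, w> + lam1*||w||_1 + lam2*sum_g ||w_g||_2 + (gamma/2)*||w||_2^2,
    # obtained by composing the l1 prox with a per-group l2 prox.
    # Each coordinate is touched a constant number of times, i.e. O(d) work.
    u = soft_threshold(-avg_grad / gamma, lam1 / gamma)
    w = np.zeros_like(u)
    for g in groups:  # each g is an array of feature indices
        norm_g = np.linalg.norm(u[g])
        if norm_g > 0.0:
            # Group-level shrinkage: weak groups are zeroed out entirely.
            w[g] = u[g] * max(0.0, 1.0 - (lam2 / gamma) / norm_g)
    return w

# Toy usage: average subgradients over a stream and apply the update.
d = 6
groups = [np.array([0, 1, 2]), np.array([3, 4, 5])]
avg_grad = np.zeros(d)
w = np.zeros(d)
rng = np.random.default_rng(0)
for t in range(1, 101):
    x, y = rng.normal(size=d), rng.normal()
    grad = (w @ x - y) * x                  # squared-loss subgradient
    avg_grad += (grad - avg_grad) / t       # running average of subgradients
    w = sgl_dual_averaging_step(avg_grad, groups, lam1=0.05, lam2=0.05, gamma=1.0)

The composition of the two proximal steps yields sparsity at both levels: the l1 step zeroes individual coordinates, while the group step can discard whole groups, matching the sparsity pattern the sparse group lasso penalty is designed to produce.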
Pages: 1171 - 1177
Page count: 7
Related papers
50 entries in total
  • [1] Accelerated Block Coordinate Descent for Sparse Group Lasso
    Catalina, Alejandro
    Alaiz, Carlos M.
    Dorronsoro, Jose R.
    2018 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2018,
  • [2] A group lasso based sparse KNN classifier
    Zheng, Shuai
    Ding, Chris
    PATTERN RECOGNITION LETTERS, 2020, 131 : 227 - 233
  • [3] A Sparse-Group Lasso
    Simon, Noah
    Friedman, Jerome
    Hastie, Trevor
    Tibshirani, Robert
    JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2013, 22 (02) : 231 - 245
  • [4] An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
    Zhang, Yangjing
    Zhang, Ning
    Sun, Defeng
    Toh, Kim-Chuan
    MATHEMATICAL PROGRAMMING, 2020, 179 (1-2) : 223 - 263
  • [5] An Iterative Sparse-Group Lasso
    Laria, Juan C.
    Carmen Aguilera-Morillo, M.
    Lillo, Rosa E.
    JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2019, 28 (03) : 722 - 731
  • [6] Sparse Damage Detection with Complex Group Lasso and Adaptive Complex Group Lasso
    Dimopoulos, Vasileios
    Desmet, Wim
    Deckers, Elke
    SENSORS, 2022, 22 (08)
  • [7] Group Guided Sparse Group Lasso Multi-task Learning for Cognitive Performance Prediction of Alzheimer's Disease
    Liu, Xiaoli
    Cao, Peng
    Yang, Jinzhu
    Zhao, Dazhe
    Zaiane, Osmar
    BRAIN INFORMATICS, BI 2017, 2017, 10654 : 202 - 212
  • [8] Hierarchical Sparse Modeling: A Choice of Two Group Lasso Formulations
    Yan, Xiaohan
    Bien, Jacob
    STATISTICAL SCIENCE, 2017, 32 (04) : 531 - 560
  • [9] Sparse group fused lasso for model segmentation: a hybrid approach
    Degras, David
    ADVANCES IN DATA ANALYSIS AND CLASSIFICATION, 2021, 15 (03) : 625 - 671
  • [10] GAP Safe Screening Rules for Sparse-Group Lasso
    Ndiaye, Eugene
    Fercoq, Olivier
    Gramfort, Alexandre
    Salmon, Joseph
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29