Sparse Learning with Stochastic Composite Optimization

Cited by: 14
Authors
Zhang, Weizhong [1 ]
Zhang, Lijun [2 ]
Jin, Zhongming [1 ]
Jin, Rong [3 ]
Cai, Deng [1 ]
Li, Xuelong [4 ]
Liang, Ronghua [5 ]
He, Xiaofei [1 ]
Affiliations
[1] Zhejiang Univ, Coll Comp Sci, State Key Lab CAD&CG, 388 Yuhang Tang Rd, Hangzhou 310058, Zhejiang, Peoples R China
[2] Nanjing Univ, Natl Key Lab Novel Software Technol, Nanjing 210023, Jiangsu, Peoples R China
[3] Alibaba Grp, Seattle, WA 98057 USA
[4] Chinese Acad Sci, State Key Lab Transient Opt & Photon, Xian Inst Opt & Precis Mech, Ctr OPT IMagery Anal & Learning OPTIMAL, Xian 710119, Shaanxi, Peoples R China
[5] Zhejiang Univ Technol, Coll Informat Engn, 288 Liuhe Rd, Hangzhou 310058, Zhejiang, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Sparse learning; stochastic optimization; stochastic composite optimization; online; algorithms
DOI
10.1109/TPAMI.2016.2578323
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
In this paper, we study Stochastic Composite Optimization (SCO) for sparse learning, which aims to learn a sparse solution of a composite objective function. Most recent SCO algorithms already attain the optimal expected convergence rate O(1/(λT)), but they often fail to deliver sparse solutions in the end, either because of the limited sparsity regularization during stochastic optimization (SO) or because of limitations in the online-to-batch conversion. Moreover, even when the objective function is strongly convex, their high-probability bounds can only attain O(√(log(1/δ)/T)), where δ is the failure probability, which is much worse than the expected convergence rate. To address these limitations, we propose a simple yet effective two-phase Stochastic Composite Optimization scheme that adds a novel, powerful sparse online-to-batch conversion to general stochastic optimization algorithms. We further develop three concrete algorithms, OptimalSL, LastSL and AverageSL, directly under our scheme to demonstrate its effectiveness. Both the theoretical analysis and the experimental results show that our methods outperform existing methods in their ability to learn sparse solutions, while improving the high-probability bound to approximately O(log(log(T)/δ)/(λT)).
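For context, the sketch below spells out the kind of composite objective the abstract refers to: minimize F(w) = E[loss(w; x, y)] + λ‖w‖₁, where the second term is the sparsity regularizer. It is a minimal, generic stochastic proximal-gradient loop on synthetic data, not the paper's OptimalSL, LastSL, or AverageSL algorithms, and all names in it (stochastic_prox_gd, soft_threshold, the synthetic X, y) are illustrative assumptions. Comparing the last iterate with the plain iterate average also hints at why a naive online-to-batch conversion by averaging tends to lose sparsity, the limitation the paper's sparse conversion targets.

```python
# Minimal illustrative sketch of stochastic composite optimization for sparse
# learning: stochastic proximal gradient on F(w) = E[0.5*(x^T w - y)^2] + lam*||w||_1.
# This is NOT the paper's OptimalSL/LastSL/AverageSL; it is a generic baseline.
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def stochastic_prox_gd(X, y, lam=0.1, T=5000, seed=0):
    """Run T stochastic proximal-gradient steps; return last and averaged iterates."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    w_sum = np.zeros(d)                      # accumulator for the plain iterate average
    for t in range(1, T + 1):
        i = rng.integers(n)                  # sample one training example
        eta = 0.5 / np.sqrt(t)               # generic decaying step size
        grad = (X[i] @ w - y[i]) * X[i]      # stochastic gradient of the smooth part
        w = soft_threshold(w - eta * grad, eta * lam)  # proximal step handles the l1 term
        w_sum += w
    return w, w_sum / T                      # last iterate vs. naive online-to-batch average

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    w_true = np.zeros(50)
    w_true[:5] = 1.0                         # sparse ground truth: 5 active features
    X = rng.standard_normal((1000, 50))
    y = X @ w_true + 0.01 * rng.standard_normal(1000)
    w_last, w_avg = stochastic_prox_gd(X, y)
    print("nonzeros, last iterate:", np.count_nonzero(np.abs(w_last) > 1e-3))
    print("nonzeros, averaged:    ", np.count_nonzero(np.abs(w_avg) > 1e-3))
```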
Pages: 1223-1236
Number of pages: 14
Related Papers
50 records in total
  • [1] Sparse Learning for Stochastic Composite Optimization
    Zhang, Weizhong
    Zhang, Lijun
    Hu, Yao
    Jin, Rong
    Cai, Deng
    He, Xiaofei
    PROCEEDINGS OF THE TWENTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2014, : 893 - 899
  • [2] Stochastic Variance Reduced Optimization for Nonconvex Sparse Learning
    Li, Xingguo
    Zhao, Tuo
    Arora, Raman
    Liu, Han
    Haupt, Jarvis
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 48, 2016, 48
  • [3] Stochastic optimization of linear sparse arrays
    Trucco, A
    Murino, V
    IEEE JOURNAL OF OCEANIC ENGINEERING, 1999, 24 (03) : 291 - 299
  • [4] Sparse learning of stochastic dynamical equations
    Boninsegna, Lorenzo
    Nueske, Feliks
    Clementi, Cecilia
    JOURNAL OF CHEMICAL PHYSICS, 2018, 148 (24):
  • [5] STOCHASTIC OPTIMIZATION OF SPARSE ANTENNA ARRAYS.
    Minvielle, P.
    Berisset, Ph
    PROCEEDINGS OF THE FOURTH EUROPEAN CONFERENCE ON ANTENNAS AND PROPAGATION, 2010,
  • [6] Hybrid SGD algorithms to solve stochastic composite optimization problems with application in sparse portfolio selection problems
    Yang, Zhen-Ping
    Zhao, Yong
    JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2024, 436
  • [7] Distributed Derivative-Free Learning Method for Stochastic Optimization Over a Network With Sparse Activity
    Li, Wenjie
    Assaad, Mohamad
    Zheng, Shiqi
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2022, 67 (05) : 2221 - 2236
  • [8] Asynchronous Doubly Stochastic Sparse Kernel Learning
    Gu, Bin
    Miao, Xin
    Huo, Zhouyuan
    Huang, Heng
    THIRTY-SECOND AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTIETH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / EIGHTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2018, : 3085 - 3092
  • [9] SASBO: Sparse Attack via Stochastic Binary Optimization
    Meng, Yihan
    Li, Weitao
    Shang, Lin
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PT I, PAKDD 2024, 2024, 14645 : 117 - 129
  • [10] Stochastic optimization using a sparse grid collocation scheme
    Sankaran, Sethuraman
    PROBABILISTIC ENGINEERING MECHANICS, 2009, 24 (03) : 382 - 396