Playing the Lottery With Concave Regularizers for Sparse Trainable Neural Networks

Times cited: 1
Authors
Fracastoro, Giulia [1]
Fosson, Sophie M. [2]
Migliorati, Andrea [1]
Calafiore, Giuseppe C. [1]
Affiliations
[1] Politecn Torino, Dept Elect & Telecommun DET, I-10129 Turin, Italy
[2] Politecn Torino, Dept Control & Comp Engn DAUIN, I-10129 Turin, Italy
Keywords
Concave regularization; lottery ticket hypothesis (LTH); neural network pruning; sparse optimization; convergence; selection
DOI
10.1109/TNNLS.2024.3373609
CLC classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
The design of sparse neural networks, i.e., networks with a reduced number of parameters, has attracted increasing research attention in the last few years. Sparse models can significantly reduce the computational and storage footprint of the inference phase. In this context, the lottery ticket hypothesis (LTH) constitutes a breakthrough result that addresses the performance not only of the inference phase but also of the training phase. It states that it is possible to extract effective sparse subnetworks, called winning tickets, that can be trained in isolation. The development of effective methods to play the lottery, i.e., to find winning tickets, is still an open problem. In this article, we propose a novel class of methods to play the lottery. The key point is the use of concave regularization to promote the sparsity of a relaxed binary mask that represents the network topology. We theoretically analyze the effectiveness of the proposed method in the convex framework. We then present extensive numerical tests on various datasets and architectures, which show that the proposed method can improve the performance of state-of-the-art algorithms.
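To make the key idea concrete, the following is a minimal PyTorch sketch of concave-regularized mask relaxation: a layer's weights are gated by a sigmoid-relaxed binary mask, and a concave log-sum sparsity penalty (in the spirit of reweighted l1 regularization) is added to the training loss to push mask entries toward zero. The names MaskedLinear and log_sum_penalty and the hyperparameters lam and eps are illustrative assumptions; the paper's exact relaxation and regularizer may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedLinear(nn.Module):
    """Linear layer whose weights are gated by a relaxed binary mask.

    Real-valued scores are squashed by a sigmoid, so each mask entry
    lies in (0, 1) and is differentiable (a relaxation of a 0/1 mask).
    """

    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(0.01 * torch.randn(out_features, in_features))
        self.bias = nn.Parameter(torch.zeros(out_features))
        self.scores = nn.Parameter(torch.zeros(out_features, in_features))

    def mask(self):
        return torch.sigmoid(self.scores)

    def forward(self, x):
        return F.linear(x, self.weight * self.mask(), self.bias)

def log_sum_penalty(mask, eps=1e-2):
    # Concave in the mask entries: sum_i log(eps + m_i).
    # Compared with the l1 norm, it penalizes small entries more
    # aggressively, driving them toward exactly zero.
    return torch.log(eps + mask).sum()

# Hypothetical single training step on random data.
layer = MaskedLinear(784, 10)
opt = torch.optim.SGD(layer.parameters(), lr=0.1)
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
lam = 1e-4  # regularization strength (illustrative value)

opt.zero_grad()
loss = F.cross_entropy(layer(x), y) + lam * log_sum_penalty(layer.mask())
loss.backward()
opt.step()

# After training, a binary topology can be recovered by thresholding,
# e.g., (layer.mask() > 0.5), yielding the candidate subnetwork.
```

In this sketch, the concave penalty plays the role the abstract assigns to it: because its gradient is largest near zero, small mask entries are suppressed faster than large ones, so the relaxed mask concentrates on a sparse subnetwork topology.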
Pages: 4575-4585
Page count: 11