SWANN: Small-World Architecture for Fast Convergence of Neural Networks

Cited by: 4
Authors
Javaheripi, Mojan [1 ]
Rouhani, Bita Darvish [2 ]
Koushanfar, Farinaz [1 ]
Affiliations
[1] Univ Calif San Diego, Dept Elect & Comp Engn, San Diego, CA 92093 USA
[2] Microsoft, Redmond, WA 98052 USA
Keywords
Deep learning; on-device training; small-world networks; PERFORMANCE; CONSENSUS;
DOI
10.1109/JETCAS.2021.3125309
CLC Classification
TM [Electrical Engineering]; TN [Electronics & Communication Technology];
Subject Classification
0808; 0809;
Abstract
On-device intelligence has become increasingly widespread in the modern smart application landscape. A standing challenge for the applicability of on-device intelligence is the excessively high computation cost of training highly accurate Deep Learning (DL) models. These models require a large number of training iterations to reach a high convergence accuracy, hindering their applicability to resource-constrained embedded devices. This paper proposes a novel transformation that changes the topology of the DL architecture to reach an optimal cross-layer connectivity. This, in turn, significantly reduces the number of training iterations required to reach a target accuracy. Our transformation leverages the important observation that, for a set level of accuracy, convergence is fastest when the network topology reaches the boundary of a small-world network. Small-world graphs are known to possess a specific connectivity structure that enables enhanced signal propagation among nodes. Our small-world models, called SWANNs, provide several intriguing benefits: they facilitate data (gradient) flow within the network, enable feature-map reuse by adding long-range connections, and accommodate various network architectures/datasets. Compared to densely connected networks (e.g., DenseNets), SWANNs require substantially fewer training parameters while maintaining a similar level of classification accuracy. We evaluate our networks on various DL model architectures and image classification datasets, namely MNIST, CIFAR10, CIFAR100, and ImageNet. Our experiments demonstrate an average of approximately 2.1x improvement in convergence speed to the desired accuracy.
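The abstract rests on the classic small-world property: compared with a regular lattice, a lightly rewired graph keeps high local clustering while its average path length collapses, which is the "enhanced signal propagation" the authors exploit for gradient flow. The paper's actual layer-connectivity transformation is not reproduced here; the following is only an illustrative sketch of the underlying Watts-Strogatz rewiring and the two graph metrics, with all function names and parameters chosen for this example.

```python
import random
from collections import deque

def ring_lattice(n, k):
    """Regular ring of n nodes, each linked to its k nearest neighbors."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k // 2 + 1):
            a, b = i, (i + j) % n
            adj[a].add(b)
            adj[b].add(a)
    return adj

def watts_strogatz(n, k, p, seed=0):
    """Rewire each lattice edge with probability p (Watts-Strogatz model)."""
    rng = random.Random(seed)
    adj = ring_lattice(n, k)
    for i in range(n):
        for j in range(1, k // 2 + 1):
            b = (i + j) % n
            if rng.random() < p and b in adj[i]:
                # Redirect edge (i, b) to a uniformly chosen new endpoint.
                choices = [c for c in range(n) if c != i and c not in adj[i]]
                if choices:
                    new = rng.choice(choices)
                    adj[i].discard(b)
                    adj[b].discard(i)
                    adj[i].add(new)
                    adj[new].add(i)
    return adj

def avg_path_length(adj):
    """Mean BFS distance over all reachable node pairs."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

def clustering(adj):
    """Average local clustering coefficient of the graph."""
    coeffs = []
    for u, nbrs in adj.items():
        nbrs = list(nbrs)
        d = len(nbrs)
        if d < 2:
            coeffs.append(0.0)
            continue
        links = sum(1 for i in range(d) for j in range(i + 1, d)
                    if nbrs[j] in adj[nbrs[i]])
        coeffs.append(2 * links / (d * (d - 1)))
    return sum(coeffs) / len(coeffs)
```

With, say, n=100 and k=6, a rewiring probability around p=0.1 typically shortens the average path length well below the lattice's while clustering stays high; that regime, near the small-world boundary, is where the paper observes the fastest convergence.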
Pages: 575-585
Page count: 11