SWANN: Small-World Architecture for Fast Convergence of Neural Networks

Cited by: 3
Authors
Javaheripi, Mojan [1 ]
Rouhani, Bita Darvish [2 ]
Koushanfar, Farinaz [1 ]
Affiliations
[1] Univ Calif San Diego, Dept Elect & Comp Engn, San Diego, CA 92093 USA
[2] Microsoft, Redmond, WA 98052 USA
Keywords
Deep learning; on-device training; small-world networks; performance; consensus
DOI
10.1109/JETCAS.2021.3125309
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Subject Classification Codes
0808; 0809
Abstract
On-device intelligence has become increasingly widespread in the modern smart application landscape. A standing challenge to the applicability of on-device intelligence is the excessively high computational cost of training highly accurate Deep Learning (DL) models. These models require a large number of training iterations to reach high convergence accuracy, which hinders their applicability to resource-constrained embedded devices. This paper proposes a novel transformation that changes the topology of the DL architecture to reach an optimal cross-layer connectivity. This, in turn, significantly reduces the number of training iterations required to reach a target accuracy. Our transformation leverages the important observation that, for a set level of accuracy, convergence is fastest when the network topology reaches the boundary of a small-world network. Small-world graphs are known to possess a specific connectivity structure that enables enhanced signal propagation among nodes. Our small-world models, called SWANNs, provide several intriguing benefits: they facilitate data (gradient) flow within the network, enable feature-map reuse by adding long-range connections, and accommodate various network architectures/datasets. Compared to densely connected networks (e.g., DenseNets), SWANNs require substantially fewer training parameters while maintaining a similar level of classification accuracy. We evaluate our networks on various DL model architectures and image classification datasets, namely MNIST, CIFAR10, CIFAR100, and ImageNet. Our experiments demonstrate an average of approximately 2.1x improvement in convergence speed to the desired accuracy.
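To make the topology idea concrete, the sketch below shows one way to wire cross-layer skip connections of a convolutional network according to a Watts-Strogatz small-world graph, so long-range edges act as shortcuts for features and gradients. This is an illustrative assumption in PyTorch, not the authors' SWANN transformation; all names and hyper-parameters (layer count, channel width, ring degree, rewiring probability) are made up for the example.

# Illustrative sketch only: cross-layer shortcuts sampled from a
# Watts-Strogatz small-world graph (assumed hyper-parameters throughout).
import networkx as nx
import torch
import torch.nn as nn


class SmallWorldCNN(nn.Module):
    """Conv stack where layer i also receives the outputs of earlier layers
    that are adjacent to i in a small-world graph."""

    def __init__(self, num_layers=8, channels=32, ring_degree=4, rewire_p=0.2,
                 num_classes=10):
        super().__init__()
        # Ring lattice of degree `ring_degree`, each edge rewired with
        # probability `rewire_p` (the classic Watts-Strogatz construction).
        self.graph = nx.watts_strogatz_graph(num_layers, ring_degree, rewire_p,
                                             seed=0)
        self.stem = nn.Conv2d(3, channels, kernel_size=3, padding=1)
        self.layers = nn.ModuleList()
        for i in range(num_layers):
            # Inputs: the immediately preceding layer (always connected via
            # the main chain) plus small-world shortcut edges arriving from
            # layers more than one step back.
            fan_in = 1 + sum(1 for j in self.graph.neighbors(i) if j < i - 1)
            self.layers.append(nn.Sequential(
                nn.Conv2d(fan_in * channels, channels, kernel_size=3, padding=1),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
            ))
        self.head = nn.Linear(channels, num_classes)

    def forward(self, x):
        outs = []                                 # cache outputs for reuse
        h = self.stem(x)
        for i, layer in enumerate(self.layers):
            # Concatenate the chain input with any long-range shortcuts.
            shortcuts = [outs[j] for j in self.graph.neighbors(i) if j < i - 1]
            h = layer(torch.cat([h] + shortcuts, dim=1))
            outs.append(h)
        return self.head(h.mean(dim=(2, 3)))      # global average pooling


if __name__ == "__main__":
    model = SmallWorldCNN()
    print(model(torch.randn(2, 3, 32, 32)).shape)   # torch.Size([2, 10])

In this sketch, each layer concatenates the cached feature maps of the earlier layers it is wired to, so the randomly rewired long-range edges provide the shortcut paths for feature reuse and gradient flow that the abstract attributes to small-world connectivity.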
Pages: 575-585
Number of pages: 11