PSO-Convolutional Neural Networks With Heterogeneous Learning Rate

Cited by: 7
Authors
Phong, Nguyen Huu [1 ]
Santos, Augusto [1 ]
Ribeiro, Bernardete [1 ]
Affiliation
[1] Univ Coimbra, Dept Informat Engn, CISUC, P-3030290 Coimbra, Portugal
Keywords
Computer vision; convolutional neural networks; deep learning; image classification; distributed computing; k-nearest neighbors; particle swarm optimization
DOI
10.1109/ACCESS.2022.3201142
CLC number
TP [Automation technology, computer technology]
Discipline code
0812
Abstract
Convolutional Neural Networks (ConvNets or CNNs) have been widely deployed in computer vision and related fields. Nevertheless, the training dynamics of these networks remain elusive: they are hard and computationally expensive to train. A myriad of architectures and training strategies have been proposed to overcome this challenge and to address problems such as speech, image, and action recognition as well as object detection. In this article, we propose a novel Particle Swarm Optimization (PSO) based training framework for ConvNets. In this framework, the weight vector of each ConvNet is cast as the position of a particle in phase space, and the collaborative dynamics of PSO are intertwined with Stochastic Gradient Descent (SGD) to boost training performance and generalization. Our approach proceeds in two phases: i) [regular phase] each ConvNet is trained independently via SGD; ii) [collaborative phase] the ConvNets share their current weight vectors (particle positions) along with their gradient estimates of the loss function. Distinct ConvNets employ distinct step sizes. By properly blending ConvNets with large (possibly random) step sizes and more conservative ones, we obtain an algorithm with competitive performance with respect to other PSO-based approaches on CIFAR-10 and CIFAR-100 (accuracies of 98.31% and 87.48%, respectively). These accuracy levels are obtained with only four ConvNets; such results are expected to scale with the number of collaborating ConvNets. Our source code is available at https://github.com/leonlha/PSO-ConvNet-Dynamics.
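The two-phase scheme described in the abstract can be sketched on a toy objective. The quadratic loss, the particle layout, and the PSO coefficients (inertia 0.7, cognitive/social weights 1.5) below are illustrative assumptions, not the authors' actual ConvNet implementation; each of the four "particles" stands in for one ConvNet's weight vector, with its own heterogeneous learning rate.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(w):
    return float(np.sum(w ** 2))          # toy convex loss, minimum at 0

def grad(w):
    return 2.0 * w                        # exact gradient of the toy loss

n_particles, dim = 4, 10                  # four ConvNets -> four particles
positions = rng.normal(size=(n_particles, dim))
velocities = np.zeros_like(positions)
lrs = np.array([0.1, 0.05, 0.01, 0.3])    # heterogeneous step sizes per particle

pbest = positions.copy()                  # personal best positions
pbest_loss = np.array([loss(w) for w in positions])
gbest = pbest[np.argmin(pbest_loss)].copy()
initial_best = float(min(pbest_loss))

for epoch in range(50):
    # i) regular phase: each particle trains independently via gradient descent
    for i in range(n_particles):
        positions[i] -= lrs[i] * grad(positions[i])

    # ii) collaborative phase: PSO velocity update blends personal and global bests
    for i in range(n_particles):
        r1, r2 = rng.random(dim), rng.random(dim)
        velocities[i] = (0.7 * velocities[i]
                         + 1.5 * r1 * (pbest[i] - positions[i])
                         + 1.5 * r2 * (gbest - positions[i]))
        positions[i] += velocities[i]
        li = loss(positions[i])
        if li < pbest_loss[i]:
            pbest_loss[i], pbest[i] = li, positions[i].copy()
    gbest = pbest[np.argmin(pbest_loss)].copy()

print(round(float(min(pbest_loss)), 6))
```

On this convex toy problem the swarm's best loss decreases well below its initial value; in the paper's setting the "positions" are full ConvNet weight vectors and the regular phase is mini-batch SGD on image data.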
Pages: 89970-89988
Page count: 19