Manifold regularization based distributed semi-supervised learning algorithm using extreme learning machine over time-varying network

Cited by: 10
Authors
Xie, Jin [1 ]
Liu, Sanyang [1 ]
Dai, Hao [2 ]
Affiliations
[1] Xidian Univ, Sch Math & Stat, Xian 710071, Shaanxi, Peoples R China
[2] Xidian Univ, Sch Aerosp Sci & Technol, Xian 710071, Shaanxi, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Distributed learning (DL); Semi-supervised learning (SSL); Manifold regularization (MR); Time-varying network; Zero-gradient-sum (ZGS); Extreme learning machine (ELM); OPTIMIZATION; FRAMEWORK; CONSENSUS; ENSEMBLE;
DOI
10.1016/j.neucom.2019.03.079
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
This paper proposes a distributed semi-supervised learning (SSL) algorithm using the extreme learning machine (ELM) over a time-varying communication network, whose topology changes over time rather than remaining fixed. In distributed SSL problems, the training data, comprising both labeled and unlabeled samples, are stored separately on the nodes of the communication network and cannot be processed centrally. To solve such problems, we propose an algorithm that combines the semi-supervised ELM (SS-ELM) algorithm with the zero-gradient-sum (ZGS) distributed optimization strategy. The SS-ELM algorithm, based on the manifold regularization (MR) framework, approximates the mapping of the samples on each node of the network. The ZGS strategy then trains the globally optimal coefficients of the single-layer feed-forward neural network (SLFNN) corresponding to the SS-ELM algorithm. We therefore denote the proposed algorithm the distributed SS-ELM (DSS-ELM) algorithm. During training, the nodes exchange updated coefficients, rather than raw data, with their neighboring nodes, which makes the DSS-ELM algorithm fully distributed and privacy-preserving. Convergence of the proposed DSS-ELM algorithm is guaranteed via the Lyapunov method. Finally, simulations are presented to demonstrate the efficiency of the proposed algorithm. (C) 2019 Published by Elsevier B.V.
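The centralized SS-ELM building block the abstract refers to can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the Gaussian k-NN graph construction, and all parameter values are our own assumptions. It solves for the output weights beta = (I + H'JH + lam*H'LH)^{-1} H'JY, where H is the random hidden-layer output matrix, J weights the labeled rows, and L is a graph Laplacian implementing the manifold regularizer.

```python
import numpy as np

def ss_elm_fit(X, Y, labeled_mask, n_hidden=40, lam=0.1, C=10.0, k=7, seed=0):
    """Minimal SS-ELM sketch (hypothetical, not the paper's code).

    X: (n, d) features; Y: (n, m) one-hot targets (unlabeled rows ignored);
    labeled_mask: boolean (n,) marking the labeled rows.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape

    # Random hidden layer: fixed random input weights/biases, sigmoid output.
    W = rng.standard_normal((d, n_hidden))
    b = rng.standard_normal(n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))

    # Gaussian-weighted k-NN graph over all (labeled + unlabeled) samples.
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    S = np.exp(-D2 / (2.0 * np.median(D2)))
    nbrs = np.argsort(D2, axis=1)[:, 1:k + 1]    # k nearest, excluding self
    A = np.zeros_like(S)
    A[np.arange(n)[:, None], nbrs] = S[np.arange(n)[:, None], nbrs]
    A = np.maximum(A, A.T)                        # symmetrize
    L = np.diag(A.sum(axis=1)) - A                # graph Laplacian

    # Penalty matrix J: weight C on labeled rows, 0 on unlabeled rows.
    J = np.diag(C * labeled_mask.astype(float))

    # beta = (I + H'JH + lam*H'LH)^{-1} H'JY
    beta = np.linalg.solve(
        np.eye(n_hidden) + H.T @ J @ H + lam * H.T @ L @ H,
        H.T @ J @ Y,
    )
    return W, b, beta

def ss_elm_predict(model, X):
    W, b, beta = model
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

With only a few labeled points per class, the Laplacian term propagates the labels along the data manifold, which is what the MR framework contributes over a purely supervised ELM.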
Pages: 24-34
Page count: 11
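The ZGS strategy the abstract builds on can be illustrated on a toy problem. The sketch below is a discrete-time approximation under the assumption of quadratic local objectives f_i(x) = 0.5*(x-c_i)' A_i (x-c_i); the function name, the edge schedule, and the step size are hypothetical, not from the paper. Each node starts at its own local minimizer (so the gradients sum to zero) and performs Hessian-weighted consensus updates over whichever edges are active at each step, modeling a time-varying topology.

```python
import numpy as np

def zgs_consensus(As, cs, edge_schedule, gamma=0.05, steps=4000):
    """Discrete-time zero-gradient-sum sketch over a time-varying graph.

    As: list of local Hessians A_i; cs: list of local minimizers c_i;
    edge_schedule: list of edge sets, cycled over time so the active
    topology switches from step to step.
    """
    X = np.array(cs, dtype=float)               # ZGS init: local minimizers
    invA = [np.linalg.inv(A) for A in As]
    for t in range(steps):
        upd = np.zeros_like(X)
        for i, j in edge_schedule[t % len(edge_schedule)]:
            upd[i] += invA[i] @ (X[j] - X[i])   # Hessian-weighted consensus
            upd[j] += invA[j] @ (X[i] - X[j])
        X += gamma * upd
    return X
```

Each per-edge update leaves the sum of A_i x_i unchanged, so when the states reach consensus every node holds the global minimizer (sum_i A_i)^{-1} sum_i A_i c_i of sum_i f_i, even though each edge set alone is disconnected and only their union is connected.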