Learning from Distributed Data Sources using Random Vector Functional-Link Networks

Times Cited: 9
Authors
Scardapane, Simone [1 ]
Panella, Massimo [1 ]
Comminiello, Danilo [1 ]
Uncini, Aurelio [1 ]
Affiliation
[1] Sapienza University of Rome, Department of Information Engineering, Electronics and Telecommunications (DIET), I-00184 Rome, Italy
Source
INNS CONFERENCE ON BIG DATA 2015 PROGRAM, 2015, Vol. 53
Keywords
Distributed learning; Random Vector Functional-Link; Multiple data sources; Alternating Direction Method of Multipliers; Consensus
DOI
10.1016/j.procs.2015.07.324
Chinese Library Classification (CLC)
TP301 [Theory, Methods]
Discipline Code
081202
Abstract
One of the main characteristics of many real-world big data scenarios is their distributed nature. In a machine learning context, distributed data, together with the requirements of preserving privacy and scaling up to large networks, brings the challenge of designing fully decentralized training protocols. In this paper, we explore the problem of distributed learning when the features of every pattern are spread across multiple agents (as happens, for example, in a distributed database scenario). We propose an algorithm for a particular class of neural networks, known as Random Vector Functional-Link (RVFL) networks, based on the Alternating Direction Method of Multipliers (ADMM) optimization algorithm. The proposed algorithm learns an RVFL network from multiple distributed data sources while restricting communication to the single operation of computing a distributed average. Our experimental simulations show that the algorithm achieves a generalization accuracy comparable to that of a fully centralized solution, while at the same time being extremely efficient.
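
The abstract only names the building blocks; the short Python sketch below illustrates how they can fit together, under assumptions of our own: a fixed random sigmoid expansion per agent, a standard ADMM "sharing" formulation for the feature-partitioned ridge problem, the premise that every agent knows the target vector y, and illustrative values for the penalty rho and the regularization lam. All function names are hypothetical; this is not the authors' implementation, only a minimal sketch of an ADMM scheme whose sole communication step is a distributed average of the local predictions.

# Minimal sketch (assumptions, not the authors' code): RVFL expansion per agent
# plus ADMM in which the only communication is a distributed average.
import numpy as np

rng = np.random.default_rng(0)

def rvfl_expand(X_local, n_hidden, rng):
    """Fixed random hidden layer: H = sigmoid(X W + b); W and b are never trained."""
    d = X_local.shape[1]
    W = rng.normal(size=(d, n_hidden))
    b = rng.normal(size=n_hidden)
    return 1.0 / (1.0 + np.exp(-(X_local @ W + b)))

def distributed_rvfl_admm(H_list, y, lam=1e-2, rho=1.0, n_iter=100):
    """ADMM for  min 0.5*||sum_k H_k b_k - y||^2 + 0.5*lam*sum_k ||b_k||^2.

    Agent k holds only H_k (the expansion of its own feature subset) and b_k.
    The single quantity exchanged per iteration is the average of H_k @ b_k.
    """
    K, n = len(H_list), y.shape[0]
    betas = [np.zeros(H.shape[1]) for H in H_list]
    z_bar = np.zeros(n)      # averaged auxiliary ("shared") prediction
    u = np.zeros(n)          # scaled dual variable, identical at every agent
    avg_pred = np.zeros(n)   # (1/K) * sum_k H_k b_k

    # Factor the local systems (lam*I + rho*H_k^T H_k) once per agent.
    chols = [np.linalg.cholesky(lam * np.eye(H.shape[1]) + rho * H.T @ H)
             for H in H_list]

    for _ in range(n_iter):
        # Local output-weight updates: no communication needed.
        for k, H in enumerate(H_list):
            target = H @ betas[k] + z_bar - avg_pred - u
            rhs = rho * (H.T @ target)
            L = chols[k]
            betas[k] = np.linalg.solve(L.T, np.linalg.solve(L, rhs))

        # The only communication step: a distributed average of local predictions.
        avg_pred = sum(H @ b for H, b in zip(H_list, betas)) / K

        # z and dual updates; every agent can compute them locally
        # (here we assume every agent also knows the target vector y).
        z_bar = (y + rho * (u + avg_pred)) / (K + rho)
        u = u + avg_pred - z_bar

    return betas

# Toy usage: 200 patterns, 6 features split evenly across 3 agents.
X = rng.normal(size=(200, 6))
y = np.sin(X[:, 0]) + 0.5 * X[:, 3] + 0.1 * rng.normal(size=200)
H_list = [rvfl_expand(X[:, 2 * k:2 * k + 2], n_hidden=30, rng=rng) for k in range(3)]
betas = distributed_rvfl_admm(H_list, y)
y_hat = sum(H @ b for H, b in zip(H_list, betas))
print("training MSE:", float(np.mean((y_hat - y) ** 2)))

In this sketch the averaged prediction is the only quantity that crosses agent boundaries; in a fully decentralized network that average would itself be computed by a consensus protocol rather than collected at a central node.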
Pages: 468-477
Page count: 10