Boost Decentralized Federated Learning in Vehicular Networks by Diversifying Data Sources

Cited by: 7
Authors
Su, Dongyuan [1 ]
Zhou, Yipeng [2 ]
Cui, Laizhong [1 ]
Affiliations
[1] Shenzhen Univ, Coll Comp Sci & Software Engn, Shenzhen, Peoples R China
[2] Macquarie Univ, Sch Comp, Fac Sci & Engn, Sydney, NSW, Australia
Source
2022 IEEE 30TH INTERNATIONAL CONFERENCE ON NETWORK PROTOCOLS (ICNP 2022) | 2022
Funding
National Key Research and Development Program of China;
Keywords
Decentralized Federated Learning; Privacy Protection; Vehicular Networks; KL Divergence; BLOCKCHAIN; FRAMEWORK;
DOI
10.1109/ICNP55882.2022.9940426
Chinese Library Classification (CLC)
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812;
Abstract
Recently, federated learning (FL) has received intensive research attention because of its ability to preserve data privacy while scattered clients collaboratively train machine learning models. Commonly, a parameter server (PS) is deployed to aggregate the model parameters contributed by different clients. Decentralized federated learning (DFL) extends FL by allowing clients to aggregate model parameters directly with their neighbours. DFL is particularly feasible for vehicular networks because vehicles communicate with each other in a vehicle-to-vehicle (V2V) manner. However, due to the restrictions of vehicle routes and communication distances, it is hard for individual vehicles to sufficiently exchange models with others. As a result, the data sources contributing to the model on an individual vehicle may not be diversified enough, resulting in poor model accuracy. To address this problem, we propose the DFL-DDS (DFL with diversified Data Sources) algorithm to diversify data sources in DFL. Specifically, each vehicle maintains a state vector to record the contribution weight of each data source to its model. The Kullback-Leibler (KL) divergence is adopted to measure the diversity of a state vector. To boost the convergence of DFL, a vehicle tunes the aggregation weight of each data source by minimizing the KL divergence of its state vector, and the effectiveness of this tuning in diversifying data sources is proved theoretically. Finally, the superiority of DFL-DDS is evaluated by extensive experiments on the MNIST and CIFAR-10 datasets, which demonstrate that DFL-DDS accelerates the convergence of DFL and significantly improves model accuracy compared with state-of-the-art baselines.
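To make the KL-divergence-based idea in the abstract concrete, the snippet below is a minimal illustrative sketch, not the authors' implementation: the per-vehicle state vector, the uniform reference distribution, and the greedy weight search (the functions kl_divergence, diversity_score, and choose_aggregation_weight) are all assumptions introduced here for illustration of how aggregation weights could be tuned to diversify data sources.

```python
# Minimal sketch (assumptions, not the DFL-DDS paper's code): score the
# diversity of a vehicle's data sources with KL divergence against the
# uniform distribution, and greedily pick an aggregation weight for a
# neighbour's model that minimizes that divergence.
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete distributions (with smoothing)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def diversity_score(state_vector):
    """Lower KL against the uniform distribution = more diverse data sources."""
    uniform = np.ones(len(state_vector)) / len(state_vector)
    return kl_divergence(state_vector, uniform)

def choose_aggregation_weight(own_state, neighbour_state, candidate_weights):
    """Hypothetical tuning rule: try candidate weights for the neighbour's
    contribution and keep the one whose merged state vector is most diverse."""
    best_w, best_kl = None, float("inf")
    for w in candidate_weights:
        merged = (1 - w) * np.asarray(own_state) + w * np.asarray(neighbour_state)
        kl = diversity_score(merged)
        if kl < best_kl:
            best_w, best_kl = w, kl
    return best_w, best_kl

# Example: a vehicle dominated by data sources 0 and 1 meets a neighbour
# whose model mainly reflects sources 2 and 3.
own = [0.45, 0.45, 0.05, 0.05]
neighbour = [0.05, 0.05, 0.45, 0.45]
w, kl = choose_aggregation_weight(own, neighbour, np.linspace(0, 1, 11))
print(f"chosen weight={w:.1f}, KL to uniform={kl:.4f}")
```

In this toy example the search selects an equal mix (weight 0.5), since that merged state vector is closest to uniform; the actual DFL-DDS tuning rule and its theoretical analysis are given in the paper itself.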
Pages: 11