Evaluation of Distributed Machine Learning Model for LoRa-ESL

Cited by: 3
Authors
Khan, Malak Abid Ali [1 ]
Ma, Hongbin [1 ]
Rehman, Zia Ur [1 ]
Jin, Ying [1 ]
Rehman, Atiq Ur [2 ]
Affiliations
[1] Beijing Inst Technol BIT, State Key Lab Intelligent Control & Decis Complex, 5 Zhongguancun South St, Beijing 100081, Peoples R China
[2] Balochistan Univ Informat Technol Engn & Managemen, Dept Elect Engn, Airport Rd, Quetta 87300, Pakistan
Funding
National Natural Science Foundation of China;
Keywords
data parallelism; machine clustering; arithmetic distribution; LoRa-ESL;
DOI
10.20965/jaciii.2023.p0700
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
To overcome the aforementioned challenges and to mitigate the retransmission and acknowledgment overhead of LoRa for electronic shelf labels (ESL), a data parallelism model is used to transmit concurrent data from the network server to end devices (EDs) through gateways (GWs). The EDs are assigned around the GWs by machine clustering to minimize data congestion, collision, and overlapping during signal reception. Deployment and redeployment of EDs within the defined clusters follow an arithmetic distribution to reduce the near-far effect and the overall saturation in the network. To further improve performance and analyze the behavior of the network, constant uplink power is proposed for the signal-to-noise ratio (SNR), while dynamic uplink power is proposed for the received signal strength (RSS). In contrast to the SNR, the RSS indicator estimates the actual position of an ED to prevent the capture effect. In the experimental implementation, downlink power at the connected defined threshold.
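The abstract's claim that RSS can estimate an ED's actual position rests on the standard relationship between received power and distance. As a minimal sketch, the log-distance path-loss model below inverts a measured RSS into an ED-to-GW distance estimate; the reference path loss `pl0_db`, path-loss exponent `n`, and reference distance `d0` are illustrative assumptions, not parameters taken from the paper.

```python
import math


def distance_from_rss(rss_dbm, tx_power_dbm, pl0_db=40.0, n=2.7, d0=1.0):
    """Estimate ED-to-GW distance (meters) from RSS via the
    log-distance path-loss model:
        RSS = Ptx - (PL0 + 10 * n * log10(d / d0))
    Solving for d gives d = d0 * 10 ** ((Ptx - RSS - PL0) / (10 * n)).
    All parameters except rss_dbm and tx_power_dbm are assumed values.
    """
    path_loss_db = tx_power_dbm - rss_dbm
    return d0 * 10 ** ((path_loss_db - pl0_db) / (10 * n))


# Example: a 14 dBm LoRa uplink received at -90 dBm -> a few hundred meters
# under the assumed indoor-like exponent n = 2.7.
d = distance_from_rss(rss_dbm=-90, tx_power_dbm=14)
```

Clustering EDs around GWs by such distance estimates, rather than by SNR alone, is what lets a scheduler separate near and far devices and so limit the capture effect the abstract describes.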
Pages: 700-709
Page count: 10