Distributed Deep Learning Optimized System over the Cloud and Smart Phone Devices

Cited by: 14
Authors
Jiang, Haotian [1 ]
Starkman, James [1 ]
Lee, Yu-Ju [2 ]
Chen, Huan [1 ]
Qian, Xiaoye [1 ]
Huang, Ming-Chun [1 ]
Affiliations
[1] Case Western Reserve Univ, Dept Elect & Comp Sci, Cleveland, OH 44106 USA
[2] Univ Colorado, Dept Comp Sci, Boulder, CO 80309 USA
Keywords
Deep learning; Data models; Data mining; Distributed databases; Computational modeling; Computer architecture; Mobile handsets; Distributed deep neural networks; mobile computing; edge computing; deep learning; wearable computers and body area networks; wearable healthcare; NEURAL-NETWORKS; INTERNET; PRIVACY
DOI
10.1109/TMC.2019.2941492
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Deep learning has become a promising focus of data mining research. With deep learning techniques, researchers can discover deep properties and features of events from quantitative mobile sensor data. However, many data sources are geographically separated and subject to strict privacy, security, and regulatory constraints. Once privacy-sensitive data are released, the data sources generally no longer physically possess them and cannot control how their personal data are used. It is therefore necessary to explore a distributed data mining architecture that can conduct consensus learning on demand. Accordingly, we propose a distributed deep learning optimized system consisting of a cloud server and multiple smartphone devices with computation capabilities, in which each device serves as a personal mobile data hub that enables mobile computing while preserving data privacy. The proposed system keeps private data locally on the smartphones, shares only trained parameters, and builds a global consensus model. The feasibility and usability of the proposed system are evaluated through three experiments and related discussion. The experimental results show that the proposed distributed deep learning system can reconstruct the behavior of centralized training. We also measure the cumulative network traffic in different scenarios and show that the partial parameter sharing strategy not only preserves the performance of the trained model but also reduces network traffic. User data privacy is protected on two levels. First, local private training data never need to be shared with other parties, and the user retains full control of their personal training data at all times. Second, only a small fraction of the trained gradients of the local model are selected for sharing, which further reduces the risk of information leakage.
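To make the partial-parameter-sharing idea in the abstract concrete, below is a minimal sketch of one plausible scheme, not the authors' implementation: each smartphone computes gradients on its own private data, shares only the largest-magnitude fraction of gradient entries with the cloud server, and the server averages the received sparse updates into the global consensus model. The linear model, the top-k selection rule, the learning rate, and all function names (local_gradient, select_partial, server_aggregate) are illustrative assumptions.

```python
# Minimal sketch of the partial-parameter-sharing idea described in the abstract.
# Assumptions (not from the paper): a single flat weight vector, plain gradient
# descent on a least-squares loss, and a top-k magnitude rule for choosing which
# gradient entries each phone shares. Names and constants are illustrative.

import numpy as np


def local_gradient(w, X, y):
    """Gradient of 0.5 * ||X w - y||^2 / n on one device's private data."""
    n = X.shape[0]
    return X.T @ (X @ w - y) / n


def select_partial(grad, share_fraction=0.1):
    """Keep only the largest-magnitude fraction of gradient entries.

    Returns (indices, values); everything else stays on the device,
    which is what reduces network traffic and limits what is exposed.
    """
    k = max(1, int(share_fraction * grad.size))
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    return idx, grad[idx]


def server_aggregate(w_global, client_updates, lr=0.1):
    """Average the sparse updates from all devices into the consensus model."""
    accum = np.zeros_like(w_global)
    counts = np.zeros_like(w_global)
    for idx, vals in client_updates:
        accum[idx] += vals
        counts[idx] += 1
    mask = counts > 0
    w_global[mask] -= lr * accum[mask] / counts[mask]
    return w_global


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, n_clients = 20, 5
    w_true = rng.normal(size=d)

    # Each "smartphone" holds its own private data; only the selected
    # gradient entries ever leave the device.
    private_data = []
    for _ in range(n_clients):
        X = rng.normal(size=(50, d))
        y = X @ w_true + 0.01 * rng.normal(size=50)
        private_data.append((X, y))

    w = np.zeros(d)  # global consensus model kept on the cloud server
    for _ in range(200):
        updates = [select_partial(local_gradient(w, X, y), 0.2)
                   for X, y in private_data]
        w = server_aggregate(w, updates)

    print("consensus model error:", np.linalg.norm(w - w_true))
```

Sharing only the largest-magnitude gradient entries is one common way to trade a little accuracy for lower traffic and reduced exposure of the local model; the paper's actual selection criterion and aggregation rule may differ.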
Pages: 147-161
Number of pages: 15
Related Papers
50 records in total
  • [1] Distributed Deep Neural Networks over the Cloud, the Edge and End Devices
    Teerapittayanon, Surat
    McDanel, Bradley
    Kung, H. T.
    2017 IEEE 37TH INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS (ICDCS 2017), 2017, : 328 - 339
  • [2] Transfer Learning Approach to IDS on Cloud IoT Devices Using Optimized CNN
    Okey, Ogobuchi Daniel
    Melgarejo, Dick Carrillo
    Saadi, Muhammad
    Rosa, Renata Lopes
    Kleinschmidt, Joao Henrique
    Rodriguez, Demostenes Zegarra
    IEEE ACCESS, 2023, 11 : 1023 - 1038
  • [3] Recommender System for Optimal Distributed Deep Learning in Cloud Datacenters
    Anwar, Muhammad Hassaan
    Ghafouri, Saeid
    Gill, Sukhpal Singh
    Doyle, Joseph
    WIRELESS PERSONAL COMMUNICATIONS, 2022, 127 (02) : 1453 - 1477
  • [4] IRIS: Smart Phone Aided Intelligent Reimbursement System Using Deep Learning
    Meng, Yang
    Wang, Run
    Wang, Juan
    Yang, Jie
    Gui, Guan
    IEEE ACCESS, 2019, 7 : 165635 - 165645
  • [5] Simple Yet Powerful: Machine Learning-Based IoT Intrusion System With Smart Preprocessing and Feature Generation Rivals Deep Learning
    Kivanc Eren, Kazim
    Kucuk, Kerem
    Ozyurt, Fatih
    Alhazmi, Omar H.
    IEEE ACCESS, 2025, 13 : 41435 - 41455
  • [6] Distributed Deep Neural Network Deployment for Smart Devices from the Edge to the Cloud
    Lin, Chang-You
    Wang, Tzu-Chen
    Chen, Kuan-Chih
    Lee, Bor-Yan
    Kuo, Jian-Jhih
    PROCEEDINGS OF THE 2019 ACM MOBIHOC WORKSHOP ON PERVASIVE SYSTEMS IN THE IOT ERA (PERSIST-IOT '19), 2019, : 43 - 48
  • [7] Deep Smart Scheduling: A Deep Learning Approach for Automated Big Data Scheduling over the Cloud
    Rjoub, Gaith
    Bentahar, Jamal
    Wahab, Omar Abdel
    Bataineh, Ahmed
    2019 7TH INTERNATIONAL CONFERENCE ON FUTURE INTERNET OF THINGS AND CLOUD (FICLOUD 2019), 2019, : 189 - 196
  • [8] Smart in-car camera system using mobile cloud computing framework for deep learning
    Chen, Chien-Hung
    Lee, Che-Rung
    Lu, Walter Chen-Hua
    VEHICULAR COMMUNICATIONS, 2017, 10 : 84 - 90
  • [9] Toward Distributed, Global, Deep Learning Using IoT Devices
    Sudharsan, Bharath
    Patel, Pankesh
    Breslin, John
    Ali, Muhammad Intizar
    Mitra, Karan
    Dustdar, Schahram
    Rana, Omer
    Jayaraman, Prem Prakash
    Ranjan, Rajiv
    IEEE INTERNET COMPUTING, 2021, 25 (03) : 6 - 12