Enable Deep Learning on Mobile Devices: Methods, Systems, and Applications

Cited by: 51
Authors
Cai, Han [1 ]
Lin, Ji [1 ]
Lin, Yujun [1 ]
Liu, Zhijian [1 ]
Tang, Haotian [1 ]
Wang, Hanrui [1 ]
Zhu, Ligeng [1 ]
Han, Song [1 ]
Affiliations
[1] MIT, 77 Massachusetts Ave, Cambridge, MA 02139 USA
Keywords
Efficient deep learning; TinyML; model compression; AutoML; neural architecture search; NEURAL-NETWORK ACCELERATOR; ARCHITECTURE; IMPLEMENTATION; COPROCESSOR; PREDICTION; MODEL;
DOI
10.1145/3486618
CLC Classification
TP3 [Computing Technology, Computer Technology];
Subject Classification
0812
Abstract
Deep neural networks (DNNs) have achieved unprecedented success in the field of artificial intelligence (AI), including computer vision, natural language processing, and speech recognition. However, their superior performance comes at the considerable cost of computational complexity, which greatly hinders their applications in many resource-constrained devices, such as mobile phones and Internet of Things (IoT) devices. Therefore, methods and techniques that are able to lift the efficiency bottleneck while preserving the high accuracy of DNNs are in great demand to enable numerous edge AI applications. This article provides an overview of efficient deep learning methods, systems, and applications. We start by introducing popular model compression methods, including pruning, factorization, and quantization, as well as compact model design. To reduce the large design cost of these manual solutions, we discuss the AutoML framework for each of them, such as neural architecture search (NAS) and automated pruning and quantization. We then cover efficient on-device training to enable user customization based on the local data on mobile devices. Apart from general acceleration techniques, we also showcase several task-specific accelerations for point cloud, video, and natural language processing by exploiting their spatial sparsity and temporal/token redundancy. Finally, to support all these algorithmic advancements, we introduce efficient deep learning system design from both software and hardware perspectives.
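To make the two compression ideas named in the abstract concrete, here is a minimal, self-contained sketch of magnitude pruning (zeroing out the smallest-magnitude weights) and uniform affine quantization (mapping floats onto a low-bit integer grid). The function names and the flat-list weight representation are illustrative simplifications, not APIs from the surveyed work; real implementations operate on tensors per layer.

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights (magnitude pruning)."""
    k = int(len(weights) * sparsity)  # how many weights to remove
    if k == 0:
        return list(weights)
    threshold = sorted(abs(w) for w in weights)[k - 1]  # k-th smallest magnitude
    return [0.0 if abs(w) <= threshold else w for w in weights]

def quantize_uniform(weights, bits=8):
    """Uniform affine quantization: float -> integer code -> dequantized float."""
    qmax = 2 ** bits - 1                     # e.g. 255 levels for 8 bits
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / qmax or 1.0          # guard against a constant tensor
    codes = [round((w - lo) / scale) for w in weights]   # integer storage format
    return [lo + c * scale for c in codes]               # values seen at inference

w = [0.9, -0.05, 0.4, -0.7, 0.02, 0.3]
pruned = magnitude_prune(w, sparsity=0.5)    # half the weights become zero
dequant = quantize_uniform(w, bits=8)        # close to w, but only 256 levels
```

Pruning trades accuracy for sparsity that hardware or sparse kernels can exploit, while quantization shrinks storage and enables integer arithmetic; the survey's AutoML discussion concerns choosing the per-layer sparsity and bit-width automatically rather than by hand, as hinted at here with the fixed `sparsity=0.5` and `bits=8`.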
Pages: 50
Related Papers
50 records in total
  • [21] Performance prediction of deep learning applications training in GPU as a service systems
    Lattuada, Marco
    Gianniti, Eugenio
    Ardagna, Danilo
    Zhang, Li
    CLUSTER COMPUTING-THE JOURNAL OF NETWORKS SOFTWARE TOOLS AND APPLICATIONS, 2022, 25 (02): : 1279 - 1302
  • [22] Deep learning forecasting for electric demand applications of cooling systems in buildings
    Runge, Jason
    Zmeureanu, Radu
    ADVANCED ENGINEERING INFORMATICS, 2022, 53
  • [23] Special issue "Deep Learning for Natural Language Processing: Emerging methods and applications"
    Esposito, Massimo
    Fujita, Hamido
    Minutolo, Aniello
    Pota, Marco
    ARRAY, 2022, 14
  • [24] Machine learning for synthetic biology: Methods and applications
    Hu, Ruyun
    Zhang, Songya
    Meng, Hailin
    Yu, Han
    Zhang, Jianzhi
    Luo, Xiaozhou
    Si, Tong
    Liu, Chenli
    Qiao, Yu
    CHINESE SCIENCE BULLETIN-CHINESE, 2021, 66 (03): : 284 - 299
  • [25] Advances and applications of machine learning and deep learning in environmental ecology and health
    Cui, Shixuan
    Gao, Yuchen
    Huang, Yizhou
    Shen, Lilai
    Zhao, Qiming
    Pan, Yaru
    Zhuang, Shulin
    ENVIRONMENTAL POLLUTION, 2023, 335
  • [26] Mobile Sensing Through Deep Learning
    Zeng, Xiao
    MOBISYS'17 PHD FORUM: PROCEEDINGS OF THE 2017 WORKSHOP ON MOBISYS 2017 PH.D. FORUM, 2017, : 5 - 6
  • [27] Automatic Learning Rate Adaption for Memristive Deep Learning Systems
    Zhang, Yang
    Shen, Linlin
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (08) : 10791 - 10802
  • [28] Mobile Devices for Teaching and Learning in Higher Education
    Mills, Henny
    PROCEEDINGS OF THE 14TH EUROPEAN CONFERENCE ON R-LEARNING (ECEL 2015), 2015, : 372 - 381
  • [29] DMS: Dynamic Model Scaling for Quality-Aware Deep Learning Inference in Mobile and Embedded Devices
    Kang, Woochul
    Kim, Daeyeon
    Park, Junyoung
    IEEE ACCESS, 2019, 7 : 168048 - 168059
  • [30] A Comprehensive Survey on Deep Graph Representation Learning Methods
    Chikwendu, Ijeoma Amuche
    Zhang, Xiaoling
    Agyemang, Isaac Osei
    Adjei-Mensah, Isaac
    Chima, Ukwuoma Chiagoziem
    Ejiyi, Chukwuebuka Joseph
    JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH, 2023, 78 : 287 - 356