Pyramid: Enabling Hierarchical Neural Networks with Edge Computing

Cited by: 54
Authors
He, Qiang [1 ]
Dong, Zeqian [1 ]
Chen, Feifei [2 ]
Deng, Shuiguang [3 ]
Liang, Weifa [4 ]
Yang, Yun [1 ]
Affiliations
[1] Swinburne Univ Technol, Dept Comp Technol, Hawthorn, Vic, Australia
[2] Deakin Univ, Sch Informat Technol, Geelong, Vic, Australia
[3] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou, Peoples R China
[4] City Univ Hong Kong, Dept Comp Sci, Hong Kong, Peoples R China
Source
Proceedings of the ACM Web Conference 2022 (WWW'22), 2022
Funding
Australian Research Council; U.S. National Science Foundation
Keywords
Web of Things; edge AI; machine learning; edge computing
DOI
10.1145/3485447.3511990
Chinese Library Classification (CLC)
TP3 [computing technology, computer technology]
Subject Classification Code
0812
Abstract
Machine learning (ML) is powering a rapidly increasing number of web applications. As a crucial part of 5G, edge computing facilitates edge artificial intelligence (AI) by supporting ML model training and inference at the network edge on edge servers. Compared with centralized cloud AI, edge AI enables low-latency ML inference, which is critical to many delay-sensitive web applications, e.g., web AR/VR, web gaming and Web-of-Things applications. Existing studies of edge AI have focused on resource and performance optimization in training and inference, leveraging edge computing merely as a tool to accelerate the training and inference processes. However, the unique ability of edge computing to process data with context awareness, a powerful feature for building the Web of Things for smart cities, has not been properly explored. In this paper, we propose a novel framework named Pyramid that unleashes the potential of edge AI by facilitating homogeneous and heterogeneous hierarchical ML inferences. We motivate and present Pyramid with traffic prediction as an illustrative example, and evaluate it through extensive experiments on two real-world datasets. The results demonstrate the superior performance of Pyramid neural networks in hierarchical traffic prediction and weather analysis.
Pages: 1860-1870 (11 pages)
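
The abstract above only describes Pyramid at a high level, and the paper's actual model architecture is not reproduced in this record. Purely as an illustration of what hierarchical ML inference across edge tiers can look like, the following is a minimal, hypothetical PyTorch sketch; the two-tier layout, class names, and tensor shapes are assumptions for illustration, not taken from the paper. Per-edge-server "leaf" predictors produce local traffic estimates, and a higher-tier aggregator fuses them into a regional estimate.

# Hypothetical two-tier sketch of hierarchical inference at the edge.
# All names and shapes are illustrative assumptions, not the paper's design.
import torch
import torch.nn as nn

class LeafPredictor(nn.Module):
    """Runs on one edge server: predicts local traffic from a recent window of readings."""
    def __init__(self, hidden: int = 32):
        super().__init__()
        self.gru = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):               # x: (batch, window, 1)
        _, h = self.gru(x)              # h: (1, batch, hidden)
        return self.head(h[-1])         # (batch, 1) local prediction

class RegionAggregator(nn.Module):
    """Runs one tier up: fuses the leaf predictions into a regional estimate."""
    def __init__(self, num_leaves: int, hidden: int = 16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(num_leaves, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, leaf_preds):      # leaf_preds: (batch, num_leaves)
        return self.mlp(leaf_preds)     # (batch, 1) regional prediction

if __name__ == "__main__":
    leaves = [LeafPredictor() for _ in range(4)]   # 4 edge servers (dummy setup)
    region = RegionAggregator(num_leaves=4)
    readings = torch.randn(8, 12, 1)               # dummy 12-step window per leaf
    local = torch.cat([leaf(readings) for leaf in leaves], dim=1)
    print(region(local).shape)                     # torch.Size([8, 1])

In this sketch the leaf models could be identical (homogeneous) or differ per edge server (heterogeneous), which is the distinction the abstract draws; only the leaf outputs travel up the hierarchy, keeping raw data local to each edge server.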