Federated Multi-task Learning with Hierarchical Attention for Sensor Data Analytics

Cited by: 8
Authors
Chen, Yujing [1 ]
Ning, Yue [2 ]
Chai, Zheng [1 ]
Rangwala, Huzefa [1 ]
Affiliations
[1] George Mason Univ, Dept Comp Sci, Fairfax, VA 22030 USA
[2] Stevens Inst Technol, Dept Comp Sci, Hoboken, NJ 07030 USA
Source
2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN) | 2020
Keywords
Sensor analytics; Attention mechanism; Multi-task learning;
DOI
10.1109/ijcnn48605.2020.9207508
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The past decade has been marked by the rapid emergence and proliferation of a myriad of small devices, such as smartphones and wearables. There is a critical need to analyze the multivariate temporal data obtained from sensors on these devices. Given the heterogeneity of sensor data, individual devices may not have sufficient high-quality data to learn an effective model, and factors such as skewed and varied data distributions further complicate sensor data analytics. In this paper, we propose to leverage multi-task learning with an attention mechanism to perform inductive knowledge transfer among related devices and improve generalization performance. We design a novel federated multi-task hierarchical attention model (FATHOM) that jointly trains classification/regression models from multiple distributed devices. The attention mechanism in the proposed model extracts feature representations from the inputs and learns a shared representation across devices to identify key features at each time step. The underlying temporal and nonlinear relationships are modeled using a combination of attention mechanisms and long short-term memory (LSTM) networks. The proposed method outperforms a wide range of competitive baselines in both classification and regression settings on three unbalanced real-world datasets. It also allows for visual characterization of the key features learned at the input (task) level and at the global temporal level.
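To make the idea of a hierarchical attention model over per-device time series concrete, the sketch below shows one way such a design could be wired up in PyTorch: an input-level attention over features at each time step, an LSTM over the re-weighted sequence, a temporal attention that pools hidden states, and a federated step that averages the shared (attention and LSTM) parameters across devices while keeping each task-specific head local. This is a minimal illustration under assumed names (FeatureAttention, TemporalAttention, DeviceModel, average_shared_parameters), not the authors' FATHOM implementation.

```python
# Minimal sketch (not the authors' code) of a hierarchical-attention LSTM
# for per-device multivariate time series of shape (batch, time, features).
# All class and function names here are illustrative assumptions.
import torch
import torch.nn as nn


class FeatureAttention(nn.Module):
    """Scores each input feature at every time step (input-level attention)."""
    def __init__(self, num_features):
        super().__init__()
        self.score = nn.Linear(num_features, num_features)

    def forward(self, x):                      # x: (batch, time, features)
        weights = torch.softmax(self.score(x), dim=-1)
        return x * weights, weights            # re-weighted inputs + weights


class TemporalAttention(nn.Module):
    """Pools LSTM hidden states into one context vector via attention over time."""
    def __init__(self, hidden_size):
        super().__init__()
        self.score = nn.Linear(hidden_size, 1)

    def forward(self, h):                      # h: (batch, time, hidden)
        weights = torch.softmax(self.score(h), dim=1)
        return (weights * h).sum(dim=1), weights


class DeviceModel(nn.Module):
    """Per-device model: feature attention -> LSTM -> temporal attention -> head."""
    def __init__(self, num_features, hidden_size, num_outputs):
        super().__init__()
        self.feature_attn = FeatureAttention(num_features)
        self.lstm = nn.LSTM(num_features, hidden_size, batch_first=True)
        self.temporal_attn = TemporalAttention(hidden_size)
        self.head = nn.Linear(hidden_size, num_outputs)  # task-specific layer

    def forward(self, x):
        x, _ = self.feature_attn(x)
        h, _ = self.lstm(x)
        context, _ = self.temporal_attn(h)
        return self.head(context)


def average_shared_parameters(models):
    """Illustrative federated step: average attention/LSTM weights across
    devices while leaving each device's task-specific head local."""
    keys = [k for k in models[0].state_dict() if not k.startswith("head.")]
    avg = {k: torch.stack([m.state_dict()[k] for m in models]).mean(0) for k in keys}
    for m in models:
        m.load_state_dict({**m.state_dict(), **avg})


if __name__ == "__main__":
    devices = [DeviceModel(num_features=6, hidden_size=32, num_outputs=2) for _ in range(3)]
    x = torch.randn(4, 20, 6)                  # dummy (batch, time, features) batch
    print(devices[0](x).shape)                 # torch.Size([4, 2])
    average_shared_parameters(devices)         # sync shared parameters across devices
```

In this sketch, knowledge transfer happens through the averaged feature/temporal attention and LSTM weights, while each device keeps its own output head, which mirrors the multi-task setup described in the abstract at a high level.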
Pages: 8
Related papers (50 in total)
  • [31] Multi-Task Metric Learning on Network Data
    Fang, Chen
    Rockmore, Daniel N.
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PART I, 2015, 9077 : 317 - 329
  • [32] Powering Multi-Task Federated Learning with Competitive GPU Resource Sharing
    Yu, Yongbo
    Yu, Fuxun
    Xu, Zirui
    Wang, Di
    Zhang, Mingjia
    Li, Ang
    Bray, Shawn
    Liu, Chenchen
    Chen, Xiang
    COMPANION PROCEEDINGS OF THE WEB CONFERENCE 2022, WWW 2022 COMPANION, 2022, : 567 - 571
  • [33] A Personalized Federated Multi-task Learning Scheme for Encrypted Traffic Classification
    Guan, Xueyu
    Du, Run
    Wang, Xiaohan
    Qu, Haipeng
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PT III, 2023, 14256 : 258 - 270
  • [34] PFedSA: Personalized Federated Multi-Task Learning via Similarity Awareness
    Ye, Chuyao
    Zheng, Hao
    Hu, Zhigang
    Zheng, Meiguang
    2023 IEEE INTERNATIONAL PARALLEL AND DISTRIBUTED PROCESSING SYMPOSIUM, IPDPS, 2023, : 480 - 488
  • [35] Personalized Federated Multi-Task Learning over Wireless Fading Channels
    Mortaheb, Matin
    Vahapoglu, Cemil
    Ulukus, Sennur
    ALGORITHMS, 2022, 15 (11)
  • [36] FEDERATED MULTI-TASK LEARNING FOR THZ WIDEBAND CHANNEL AND DOA ESTIMATION
    Elbir, Ahmet M.
    Shi, Wei
    Mishra, Kumar Vijay
    Chatzinotas, Symeon
    2023 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING WORKSHOPS, ICASSPW, 2023,
  • [37] FedBone: Towards Large-Scale Federated Multi-Task Learning
    Chen, Yi-Qiang
    Zhang, Teng
    Jiang, Xin-Long
    Chen, Qian
    Gao, Chen-Long
    Huang, Wu-Liang
    JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2024, 39 (05) : 1040 - 1057
  • [38] Multi-task gradient descent for multi-task learning
    Bai, Lu
    Ong, Yew-Soon
    He, Tiantian
    Gupta, Abhishek
    MEMETIC COMPUTING, 2020, 12 (04) : 355 - 369
  • [40] Multi-task learning with Attention : Constructing auxiliary tasks for learning to learn
    Li, Benying
    Dong, Aimei
    2021 IEEE 33RD INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI 2021), 2021, : 145 - 152