Task-Projected Hyperdimensional Computing for Multi-task Learning

Cited by: 1
Authors
Chang, Cheng-Yang [1]
Chuang, Yu-Chuan [1]
Wu, An-Yeu [1]
Affiliations
[1] Natl Taiwan Univ, Grad Inst Elect Engn, Taipei, Taiwan
Source
ARTIFICIAL INTELLIGENCE APPLICATIONS AND INNOVATIONS, AIAI 2020, PT I | 2020 / Vol. 583
Keywords
Hyperdimensional Computing; Multi-task learning; Redundant dimensionality;
DOI
10.1007/978-3-030-49161-1_21
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Brain-inspired Hyperdimensional (HD) computing is an emerging technique for cognitive tasks in the field of low-power design. As an energy-efficient and fast-learning computational paradigm, HD computing has shown great success in many real-world applications. However, an HD model incrementally trained on multiple tasks suffers from catastrophic forgetting: the model loses the knowledge learned from previous tasks and focuses only on the current one. To the best of our knowledge, no study has investigated the feasibility of applying multi-task learning to HD computing. In this paper, we propose Task-Projected Hyperdimensional Computing (TP-HDC), which lets a single HD model support multiple tasks simultaneously by exploiting the redundant dimensionality of the hyperspace. To mitigate interference between different tasks, we project each task into a separate subspace for learning. Compared with the baseline method, our approach efficiently utilizes the unused capacity of the hyperspace and achieves a 12.8% improvement in average accuracy with negligible memory overhead.
Pages: 241-251
Page count: 11
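
The abstract's core mechanism is projecting each task into its own subspace of a shared hyperspace so that per-task knowledge can coexist in one model. Below is a minimal Python sketch of one common way such task projection is realized in HD computing: binding each task's encoded samples with a random per-task key hypervector before bundling them into shared class prototypes. The class name TaskProjectedHD, the key-binding projection, and all parameters here are illustrative assumptions for this sketch, not the paper's exact construction.

```python
import numpy as np

# Hypothetical sketch of task-projected HD classification; the paper's
# exact projection operator may differ.
D = 10_000                      # hyperdimensionality
rng = np.random.default_rng(0)

def random_hv():
    """Random bipolar hypervector in {-1, +1}^D."""
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    """Binding via element-wise multiplication (self-inverse for bipolar HVs)."""
    return a * b

class TaskProjectedHD:
    """One shared prototype memory; each task is rotated into its own
    quasi-orthogonal subspace by binding with a per-task random key."""

    def __init__(self, num_tasks, num_classes):
        self.task_keys = [random_hv() for _ in range(num_tasks)]
        self.prototypes = np.zeros((num_classes, D))

    def train(self, task_id, samples):
        """samples: iterable of (encoded_hv, class_label) pairs."""
        key = self.task_keys[task_id]
        for hv, label in samples:
            # Project the sample into the task's subspace, then bundle
            # (add) it into the shared class prototype.
            self.prototypes[label] += bind(hv, key)

    def classify(self, task_id, hv):
        # Binding is self-inverse, so binding the query with the same
        # task key aligns it with that task's stored prototypes, while
        # other tasks' contributions behave like near-orthogonal noise.
        query = bind(hv, self.task_keys[task_id])
        return int(np.argmax(self.prototypes @ query))

# Toy usage: two tasks, three classes, one random "encoded" sample.
model = TaskProjectedHD(num_tasks=2, num_classes=3)
x = random_hv()
model.train(task_id=0, samples=[(x, 2)])
assert model.classify(task_id=0, hv=x) == 2
```

Because random bipolar hypervectors are quasi-orthogonal at D = 10,000, prototypes bound with one task's key look like near-zero-mean noise to queries bound with another task's key. This is one way to exploit the redundant dimensionality the abstract refers to, at the cost of only one extra key hypervector per task.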