Design space exploration of neural network accelerator based on transfer learning

Cited by: 0
Authors
Wu Y. [1 ]
Zhi T. [2 ]
Song X. [2 ]
Li X. [1 ]
Affiliations
[1] School of Computer Science, University of Science and Technology of China, Hefei
[2] State Key Lab of Processors, Institute of Computing Technology, Chinese Academy of Sciences, Beijing
Funding
National Natural Science Foundation of China
Keywords
design space exploration (DSE); multi-task learning; neural network accelerator; transfer learning;
DOI
10.3772/j.issn.1006-6748.2023.04.009
Abstract
With the increasing computational demands of artificial intelligence (AI) algorithms, dedicated accelerators have become a necessity. However, the complexity of hardware architectures, the vast design search space, and the varied tasks accelerators must run pose significant challenges. Traditional search methods become prohibitively slow as the search space grows. A design space exploration (DSE) method based on transfer learning is proposed, which reduces the time spent on repeated training and uses multi-task models for different tasks on the same processor. The proposed method accurately predicts the latency and energy consumption associated with neural network accelerator design parameters, identifying optimal designs faster than traditional methods. Compared with other DSE methods using a multilayer perceptron (MLP), it also requires less training time. Comparative experiments demonstrate that the proposed method improves the efficiency of DSE without compromising the accuracy of the results. © 2023 Inst. of Scientific and Technical Information of China. All rights reserved.
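The abstract describes a multi-task surrogate model that predicts latency and energy from accelerator design parameters, with transfer learning avoiding retraining from scratch for a new task. The following is a minimal hypothetical sketch (not the authors' code) of that structure: a shared MLP trunk with one linear head per prediction target, where transferring to a new task means reusing the trunk and attaching a fresh head. All names and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

class MultiTaskMLP:
    """Hypothetical multi-task surrogate: design parameters -> per-task predictions."""

    def __init__(self, n_params=6, hidden=16, tasks=("latency", "energy")):
        # Shared trunk weights: this is the part reused (transferred) across tasks.
        self.W1 = rng.normal(0.0, 0.1, (n_params, hidden))
        self.b1 = np.zeros(hidden)
        # One small linear head per task: cheap to retrain for a new workload.
        self.heads = {t: (rng.normal(0.0, 0.1, (hidden, 1)), np.zeros(1))
                      for t in tasks}

    def forward(self, x, task):
        h = relu(x @ self.W1 + self.b1)   # shared representation of the design point
        W2, b2 = self.heads[task]
        return (h @ W2 + b2).ravel()

    def add_task(self, task):
        # Transfer step: keep the trained trunk, attach a fresh head only.
        hidden = self.W1.shape[1]
        self.heads[task] = (rng.normal(0.0, 0.1, (hidden, 1)), np.zeros(1))

model = MultiTaskMLP()
x = rng.normal(size=(4, 6))        # 4 candidate accelerator design points
lat = model.forward(x, "latency")  # per-point latency predictions
eng = model.forward(x, "energy")   # per-point energy predictions
model.add_task("area")             # a new task reuses the shared trunk
area = model.forward(x, "area")
```

In a DSE loop, such a surrogate would score candidate design points far faster than cycle-accurate simulation; only the task-specific heads (and optionally a light fine-tune of the trunk) would be trained when moving to a new workload on the same processor.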
Pages: 416-426 (10 pages)