Modeling and Library Support for Early-Stage Exploration of Sparse Tensor Accelerator Designs

Cited by: 0
Authors
Ha, Whoi Ree
Kim, Hyunjun
Paek, Yunheung [1 ]
Affiliations
[1] Seoul Natl Univ, Dept Elect & Comp Engn, Seoul 08826, South Korea
Keywords
Tensors; Computational modeling; Costs; Accelerators; Encoding; Deep learning; Estimation; Space missions; AI accelerators; design space exploration; sparsity-aware accelerators; sparse tensor accelerators;
DOI
10.1109/ACCESS.2023.3278274
CLC Number (Chinese Library Classification)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Techniques such as pruning and dimension reduction, along with the inherent characteristics of data in applications such as natural language processing and object detection, introduce sparsity into deep learning models. Sparse tensor accelerators exploit this sparsity (zeros) in the data to skip ineffectual computations and thereby speed up overall run-time. Researchers have proposed numerous supporting mechanisms, such as encoding, decoding, non-zero extraction, and load balancing. However, because each mechanism requires specialized hardware to accommodate its unique features, the design space of a new sparse accelerator becomes much larger than that of a regular tensor accelerator. These features are also hard to compare, since their efficiency varies with the application and the data sparsity. In this paper, we classify the popularly used features of sparse tensor accelerators and support their modeling. Modeling these features lets us cover a much larger design space and estimate costs more accurately. We also include library support for these features to make early-stage exploration more realistic. Overall, our experiments show that we can analytically estimate the previously un-modeled components with 93% accuracy on average, and we provide 19 features as library support.
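As a rough illustration of the mechanisms the abstract names, the Python sketch below (not taken from the paper or its library; function names and the example matrix are made up for illustration) mimics in software what encoding and non-zero extraction hardware do in a sparse tensor accelerator: compress away the zeros, then perform only effectual multiplies.

# Minimal, illustrative sketch (not the paper's model or library API):
# a compressed-sparse-row (CSR) style encoding and a matrix-vector
# multiply that touches only non-zero entries, mimicking in software
# the non-zero extraction logic of a sparse tensor accelerator.

def encode_csr(dense):
    """Encode a dense 2-D list into CSR-style (values, col_idx, row_ptr)."""
    values, col_idx, row_ptr = [], [], [0]
    for row in dense:
        for j, v in enumerate(row):
            if v != 0:                 # non-zero extraction: keep only effectual operands
                values.append(v)
                col_idx.append(j)
        row_ptr.append(len(values))    # row boundary in the compressed stream
    return values, col_idx, row_ptr

def spmv(values, col_idx, row_ptr, x):
    """Sparse matrix-vector product: every multiply here is effectual."""
    y = [0] * (len(row_ptr) - 1)
    for i in range(len(y)):
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += values[k] * x[col_idx[k]]
    return y

# Example: a 75%-sparse matrix needs only 2 multiplies instead of 8.
A = [[0, 3, 0, 0],
     [0, 0, 0, 5]]
print(spmv(*encode_csr(A), [1, 2, 3, 4]))   # -> [6, 20]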
Pages: 55361 - 55369
Page count: 9