Hyperdimensional computing: a framework for stochastic computation and symbolic AI

Cited by: 0
Authors
Heddes, Mike [1 ]
Nunes, Igor [1 ]
Givargis, Tony [1 ]
Nicolau, Alexandru [1 ]
Veidenbaum, Alex [1 ]
Affiliations
[1] Univ Calif Irvine, Dept Comp Sci, Irvine, CA 92617 USA
Keywords
Hyperdimensional computing; Vector symbolic architectures; Basis hypervectors; Graph classification; Dynamic hash table; GRAPH; REGRESSION; MODELS
DOI
10.1186/s40537-024-01010-8
Chinese Library Classification (CLC)
TP301 [Theory and methods]
Discipline code
081202
Abstract
Hyperdimensional Computing (HDC), also known as Vector Symbolic Architectures (VSA), is a neuro-inspired computing framework that exploits high-dimensional random vector spaces. HDC uses extremely parallelizable arithmetic to provide computational solutions that balance accuracy, efficiency, and robustness. The majority of current HDC research focuses on the learning capabilities of these high-dimensional spaces. However, a tangential research direction investigates the properties of these high-dimensional spaces more generally as a probabilistic model for computation. In this manuscript, we provide an approachable, yet thorough, survey of the components of HDC. To highlight the dual use of HDC, we provide an in-depth analysis of two vastly different applications. The first uses HDC in a learning setting to classify graphs. Graphs are among the most important forms of information representation, and graph learning in IoT and sensor networks introduces challenges because of limited compute capabilities. Compared to state-of-the-art Graph Neural Networks, our proposed method achieves comparable accuracy, while training and inference times are on average 14.6x and 2.0x faster, respectively. The second analyses a dynamic hash table that uses a novel hypervector type, called circular hypervectors, to map requests to a dynamic set of resources. The proposed hyperdimensional hashing method is efficient enough to be deployed in large systems. Moreover, our approach remains unaffected by a realistic level of memory errors, a level that causes significant mismatches for existing methods.
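The "extremely parallelizable arithmetic" the abstract refers to is a small set of elementwise operations on random hypervectors. The following is a minimal, self-contained toy sketch of the two standard VSA operations on bipolar hypervectors, binding (elementwise multiply) and bundling (elementwise majority); the function names and the key-value example are illustrative assumptions, not the paper's implementation:

```python
import random

random.seed(0)
D = 10_000  # dimensionality; high D makes independent random vectors quasi-orthogonal

def random_hv():
    """Random bipolar (+1/-1) hypervector."""
    return [random.choice((-1, 1)) for _ in range(D)]

def bind(a, b):
    """Binding: elementwise multiply; the result is dissimilar to both inputs."""
    return [x * y for x, y in zip(a, b)]

def bundle(*hvs):
    """Bundling: elementwise majority (sign of the sum); similar to each input."""
    return [1 if s > 0 else -1 if s < 0 else 0 for s in map(sum, zip(*hvs))]

def sim(a, b):
    """Normalized dot-product similarity in [-1, 1]."""
    return sum(x * y for x, y in zip(a, b)) / D

# Encode a tiny key-value record: bind each key to its value, then bundle.
k1, v1, k2, v2 = random_hv(), random_hv(), random_hv(), random_hv()
record = bundle(bind(k1, v1), bind(k2, v2))

# Query: binding is self-inverse for bipolar vectors, so binding the record
# with k1 again recovers a noisy copy of v1.
query = bind(record, k1)
assert sim(query, v1) > 0.3          # noticeably similar to v1
assert abs(sim(query, v2)) < 0.1     # nearly orthogonal to v2
```

Because independent random bipolar vectors in high dimensions are nearly orthogonal, the unbound query stays recognizably close to `v1` even though the record superposes several pairs; this tolerance to noise is the general property that the survey's learning and hashing applications build on.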
Pages: 32
Related papers
50 items total
  • [21] On Effects of Compression with Hyperdimensional Computing in Distributed Randomized Neural Networks
    Rosato, Antonello
    Panella, Massimo
    Osipov, Evgeny
    Kleyko, Denis
    ADVANCES IN COMPUTATIONAL INTELLIGENCE (IWANN 2021), PT II, 2021, 12862 : 155 - 167
  • [22] Hyperdimensional computing with holographic and adaptive encoder
    Hernandez-Cano, Alejandro
    Ni, Yang
    Zou, Zhuowen
    Zakeri, Ali
    Imani, Mohsen
    FRONTIERS IN ARTIFICIAL INTELLIGENCE, 2024, 7
  • [23] Multiarchitecture Hardware Acceleration of Hyperdimensional Computing
    Peitzsch, Ian
    Ciora, Mark
    George, Alan D.
    2023 IEEE HIGH PERFORMANCE EXTREME COMPUTING CONFERENCE, HPEC, 2023
  • [24] Frontiers in Edge AI with RISC-V: Hyperdimensional Computing vs. Quantized Neural Networks
    Genssler, Paul R.
    Wasif, Sandy A.
    Wael, Miran
    Novkin, Rodion
    Amrouch, Hussam
    2024 DESIGN, AUTOMATION & TEST IN EUROPE CONFERENCE & EXHIBITION, DATE, 2024
  • [25] Hierarchical Hyperdimensional Computing for Energy Efficient Classification
    Imani, Mohsen
    Huang, Chenyu
    Kong, Deqian
    Rosing, Tajana
    2018 55TH ACM/ESDA/IEEE DESIGN AUTOMATION CONFERENCE (DAC), 2018
  • [26] Performance Analysis of Hyperdimensional Computing for Character Recognition
    Manabat, Alec Xavier
    Marcelo, Celine Rose
    Quinquito, Alfonso Louis
    Alvarez, Anastacia
    2019 INTERNATIONAL SYMPOSIUM ON MULTIMEDIA AND COMMUNICATION TECHNOLOGY (ISMAC), 2019
  • [27] Redundancy Pruning for Binary Hyperdimensional Computing Architectures
    Antonio, Ryan Albert G.
    Alvarez, Anastacia B.
    2022 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS 22), 2022, : 2097 - 2101
  • [28] On separating long- and short-term memories in hyperdimensional computing
    Teeters, Jeffrey L.
    Kleyko, Denis
    Kanerva, Pentti
    Olshausen, Bruno A.
    FRONTIERS IN NEUROSCIENCE, 2023, 16
  • [29] Symbolic Hyperdimensional Vectors with Sparse Graph Convolutional Neural Networks
    Cornell, Filip
    Karlgren, Jussi
    Animesh
    Girdzijauskas, Sarunas
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022
  • [30] SupportHDC: Hyperdimensional Computing with Scalable Hypervector Sparsity
    Safa, Ali
    Ocket, Ilja
    Catthoor, Francky
    Gielen, Georges
    PROCEEDINGS OF THE 2023 ANNUAL NEURO-INSPIRED COMPUTATIONAL ELEMENTS CONFERENCE, NICE 2023, 2023, : 20 - 25