Spiking neural networks for nonlinear regression

Cited by: 11
Authors
Henkes, Alexander [1,2]
Eshraghian, Jason K. [3]
Wessels, Henning [2]
Affiliations
[1] Swiss Fed Inst Technol, Computat Mech Grp, Zurich, Switzerland
[2] Tech Univ Carolo Wilhelmina Braunschweig, Div Data Driven Modeling Mech Syst, Braunschweig, Germany
[3] Univ Calif Santa Cruz, Neuromorph Comp Grp, Santa Cruz, CA USA
Source
ROYAL SOCIETY OPEN SCIENCE | 2024, Vol. 11, Issue 05
Keywords
artificial neural networks; spiking neural networks; regression; neuromorphic computing; universal approximation; operators
DOI
10.1098/rsos.231606
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biosciences]; N [General Natural Sciences]
Subject Classification Codes
07; 0710; 09
Abstract
Spiking neural networks (SNNs), often referred to as the third generation of neural networks, carry the potential for a massive reduction in memory and energy consumption compared with traditional, second-generation neural networks. Inspired by the efficiency of the human brain, they introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware. Energy efficiency plays a crucial role in many engineering applications, for instance in structural health monitoring. Machine learning in engineering contexts, especially in data-driven mechanics, focuses on regression. While regression with SNNs has already been discussed in a variety of publications, this contribution provides a novel formulation aimed at both accuracy and energy efficiency. In particular, a network topology for decoding binary spike trains to real numbers is introduced, using the membrane potential of spiking neurons. Several spiking neural architectures, ranging from simple spiking feed-forward networks to complex spiking long short-term memory networks, are derived. Since the proposed architectures do not contain any dense layers, they exploit the full potential of SNNs in terms of energy efficiency. At the same time, the accuracy of the proposed SNN architectures is demonstrated by numerical examples, namely different material models: linear and nonlinear, as well as history-dependent, material models are examined. While this contribution focuses on mechanical examples, the interested reader may regress any custom function by adapting the published source code.
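The following is a minimal PyTorch sketch, not the authors' published code, of the decoding idea described in the abstract: a spiking feed-forward regressor whose readout is a non-firing leaky integrator, so the real-valued prediction is taken from the membrane potential rather than from spike counts. The leak factor, layer sizes, surrogate-gradient slope, and toy regression target are illustrative assumptions.

```python
# Sketch (assumption: not the published implementation) of membrane-potential
# decoding for regression with a spiking feed-forward network.
import torch
import torch.nn as nn


def surrogate_spike(mem, threshold=1.0, slope=25.0):
    """Heaviside spike in the forward pass, sigmoid surrogate in the backward pass."""
    spk_hard = (mem >= threshold).float()
    spk_soft = torch.sigmoid(slope * (mem - threshold))
    # Straight-through estimator: forward uses the hard spike,
    # gradients flow through the soft approximation.
    return spk_soft + (spk_hard - spk_soft).detach()


class SpikingRegressor(nn.Module):
    def __init__(self, n_in=1, n_hidden=64, n_out=1, beta=0.9):
        super().__init__()
        # The linear maps are synaptic weight matrices driven by binary spikes,
        # not conventional dense activation layers.
        self.fc_in = nn.Linear(n_in, n_hidden)
        self.fc_out = nn.Linear(n_hidden, n_out)
        self.beta = beta  # membrane leak factor per time step (illustrative)

    def forward(self, x_seq):
        # x_seq: (time_steps, batch, n_in)
        mem_hidden = torch.zeros(x_seq.size(1), self.fc_in.out_features)
        mem_out = torch.zeros(x_seq.size(1), self.fc_out.out_features)
        outputs = []
        for x_t in x_seq:
            # Hidden leaky integrate-and-fire layer: integrate, spike, soft reset.
            mem_hidden = self.beta * mem_hidden + self.fc_in(x_t)
            spk = surrogate_spike(mem_hidden)
            mem_hidden = mem_hidden - spk  # reset by subtraction
            # Readout: leaky integrator without a threshold; its membrane
            # potential is the real-valued regression output.
            mem_out = self.beta * mem_out + self.fc_out(spk)
            outputs.append(mem_out)
        return torch.stack(outputs)  # (time_steps, batch, n_out)


if __name__ == "__main__":
    # Toy regression target y = x**2, with the input repeated over time steps.
    model = SpikingRegressor()
    x = torch.linspace(-1.0, 1.0, 32).unsqueeze(-1)   # (batch, 1)
    x_seq = x.unsqueeze(0).repeat(20, 1, 1)           # (time_steps, batch, 1)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
    for _ in range(200):
        pred = model(x_seq)[-1]                       # read membrane at last step
        loss = nn.functional.mse_loss(pred, x ** 2)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    print(f"final MSE: {loss.item():.4f}")
```

Reading the prediction from the final membrane potential is one simple choice; averaging the membrane potential over the last time steps is an equally plausible decoding under the same idea.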
Pages: 23