Temporal Coding of Neural Stimuli

Cited by: 0
Authors
Horzyk, Adrian [1 ]
Goldon, Krzysztof [1 ]
Starzyk, Janusz A. [2 ,3 ]
Affiliations
[1] AGH Univ Sci & Technol, Krakow, Poland
[2] Univ Informat Technol & Management Rzeszow, Rzeszow, Poland
[3] Ohio Univ, Sch EECS, Athens, OH 45701 USA
Source
ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: WORKSHOP AND SPECIAL SESSIONS | 2019 / Vol. 11731
Keywords
Temporal coding; Temporal neurons; Feature representation in the time space; Stimuli receptor transformation into the time space; Associative temporal neural networks; Associative graph data structure; SPIKING; NETWORKS;
DOI
10.1007/978-3-030-30493-5_56
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Contemporary artificial neural networks use various metrics to code input data and, unlike biological neural systems, usually do not use temporal coding. Real neural systems operate in time and use time to code external stimuli of various kinds, producing a uniform internal data representation that can be used for further neural computations. This paper shows how this can be done using special receptors and neurons that use time to code both external data and the internal results of computations. When neural processes take different amounts of time, the activation times of neurons can be used to code the results of computations. Such neurons can automatically find data associated with the given inputs. In this way, we can find the most similar objects represented by the network and use them for recognition or classification tasks. The conducted research and its results show that time space, temporal coding, and temporal neurons can be used in place of a data feature space, the direct use of input data, and classic artificial neurons. Time and temporal coding may become an important branch in the development of future artificial neural networks inspired by biological neurons.
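The core idea of the abstract, that similarity can be read off as an activation time, with the most similar stored object firing first, can be illustrated with a minimal sketch. This is not the authors' implementation; the linear mapping from feature distance to firing time and the example patterns are assumptions made only for illustration.

```python
def activation_time(stored, query):
    """Hypothetical firing time of a temporal neuron tuned to `stored`.

    Assumption: firing time grows with the summed absolute feature
    difference, so an input identical to the stored pattern fires
    at t = 0, i.e. earliest.
    """
    return sum(abs(s - q) for s, q in zip(stored, query))

def first_to_fire(patterns, query):
    """Return the label of the stored pattern whose neuron fires first.

    The earliest activation time plays the role of the similarity
    metric: no explicit distance comparison in feature space is needed,
    only the order in which neurons become active.
    """
    return min(patterns, key=lambda label: activation_time(patterns[label], query))

# Two illustrative stored objects (made-up 4-feature vectors):
patterns = {
    "class_A": [5.0, 3.4, 1.5, 0.2],
    "class_B": [6.5, 3.0, 5.5, 2.0],
}
query = [5.1, 3.5, 1.4, 0.3]
print(first_to_fire(patterns, query))  # the earliest-firing neuron wins
```

In a spiking implementation the `min` over precomputed times would instead emerge physically: all neurons integrate the input in parallel, and whichever crosses its threshold first inhibits or simply precedes the rest.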
Pages: 607-621
Page count: 15