Probabilistic Spiking Neural Network for Robotic Tactile Continual Learning

Cited by: 2
Authors
Fang, Senlin [1 ,2 ]
Liu, Yiwen [2 ]
Liu, Chengliang [2 ]
Wang, Jingnan [2 ]
Su, Yuanzhe [2 ]
Zhang, Yupo [2 ]
Kong, Hoiio [1 ]
Yi, Zhengkun [2 ]
Wu, Xinyu [2 ]
Affiliations
[1] City Univ Macau, Macau 999078, Peoples R China
[2] Shenzhen Inst Adv Technol, Inst Artificial Intelligence & Robot Soc, SIAT Branch, Dept Intelligent Syst & Robot Learning Lab, ISRL G, Shenzhen 518055, Peoples R China
Source
2024 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA 2024 | 2024
Funding
National Natural Science Foundation of China
Keywords
DOI
10.1109/ICRA57147.2024.10610553
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
The sense of touch is essential for robots to perform various daily tasks. Artificial Neural Networks (ANNs) have shown significant promise in advancing robotic tactile learning. However, because the tactile data distribution changes as robots encounter new tasks, ANN-based robotic tactile learning suffers from catastrophic forgetting. To address this problem, we introduce a novel continual learning (CL) framework called the Probabilistic Spiking Neural Network with Variational Continual Learning (PSNN-VCL). In this framework, the PSNN introduces uncertainty during spike emission and enables fast variational inference by optimizing this uncertainty through backpropagation, which significantly reduces the number of model parameters required for VCL. We establish a robotic tactile CL benchmark from publicly available datasets to evaluate our method. Experimental results demonstrate that, compared with other CL methods, PSNN-VCL not only achieves superior performance on widely used CL metrics but also reduces model parameters by at least 50% on the robotic tactile CL benchmark.
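Illustrative note: the abstract describes spike emission with learnable uncertainty that is optimized by backpropagation for variational inference. The minimal PyTorch sketch below shows one way such a probabilistic spiking (leaky integrate-and-fire) layer could look, with the spike probability given by a sigmoid of the membrane potential and a straight-through Bernoulli sample keeping the layer differentiable. The layer name ProbLIFLayer, the decay and temperature parameters, and the straight-through relaxation are assumptions for illustration only and are not taken from the paper.

import torch
import torch.nn as nn

class ProbLIFLayer(nn.Module):
    # Hypothetical probabilistic leaky integrate-and-fire layer (illustration only).
    def __init__(self, in_features, out_features, decay=0.9, temperature=1.0):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        self.decay = decay              # membrane-potential leak factor (assumed value)
        self.temperature = temperature  # sharpness of the spike probability (assumed value)

    def forward(self, x_seq):
        # x_seq: (time_steps, batch, in_features) input spike/feature sequence
        mem = torch.zeros(x_seq.size(1), self.fc.out_features, device=x_seq.device)
        spikes = []
        for x_t in x_seq:
            mem = self.decay * mem + self.fc(x_t)
            p_spike = torch.sigmoid(mem / self.temperature)  # spike probability per neuron
            if self.training:
                # Straight-through Bernoulli sample: hard spike in the forward pass,
                # gradient flows through p_spike in the backward pass.
                hard = torch.bernoulli(p_spike)
                spike = hard + p_spike - p_spike.detach()
            else:
                spike = (p_spike > 0.5).float()
            mem = mem - spike * mem.detach()  # soft reset where a spike occurred
            spikes.append(spike)
        return torch.stack(spikes)  # (time_steps, batch, out_features)

# Minimal usage on random data
layer = ProbLIFLayer(in_features=16, out_features=8)
x = torch.rand(10, 4, 16)   # 10 time steps, batch of 4
out = layer(x)
print(out.shape)            # torch.Size([10, 4, 8])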
Pages: 530-536
Number of pages: 7