Accretionary Learning With Deep Neural Networks With Applications

Times Cited: 1
Authors
Wei, Xinyu [1 ]
Juang, Biing-Hwang [1 ]
Wang, Ouya [2 ]
Zhou, Shenglong [3 ]
Li, Geoffrey Ye [2 ]
Affiliations
[1] Georgia Inst Technol, Sch Elect & Comp Engn, Atlanta, GA 30332 USA
[2] Imperial Coll London, Dept Elect & Elect Engn, London SW7 2BX, England
[3] Beijing Jiaotong Univ, Sch Math & Stat, Beijing, Peoples R China
Keywords
Artificial neural networks; Data models; Knowledge engineering; Task analysis; Training; Speech recognition; Learning systems; Deep learning; accretion learning; deep neural networks; pattern recognition; wireless communications; CLASSIFICATION;
DOI
10.1109/TCCN.2023.3342454
Chinese Library Classification (CLC)
TN [Electronic technology; communication technology]
Discipline Code
0809
Abstract
One of the fundamental limitations of Deep Neural Networks (DNNs) is their inability to acquire and accumulate new cognitive capabilities in an incremental or progressive manner. When data arrive from object classes outside the learned set, a conventional DNN cannot recognize them because of the fixed-class formulation it assumes. The typical remedy is to redesign and retrain a new, usually larger, network for the expanded set of object classes. This process is quite different from that of a human learner. In this paper, we propose a new learning method named Accretionary Learning (AL) to emulate human learning, in which the set of object classes to be recognized need not be fixed and can grow as the situation arises without an entire redesign of the system. The proposed learning structure is modularized and can dynamically expand to learn and register new knowledge as the set of objects grows in size. AL does not forget previously acquired knowledge when learning new data classes. We show that the structure and its learning methodology lead to a system that can grow to cope with increased cognitive complexity while providing stable and superior overall performance.
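The abstract describes a modularized structure that grows by registering a new module for each newly encountered class while leaving existing knowledge untouched. The Python/PyTorch sketch below illustrates one way such accretion could be organized under those stated properties; the class names, dimensions, and per-class scoring heads are illustrative assumptions, not the paper's actual architecture.

```python
# Minimal sketch of an accretion-style classifier: each registered class gets its
# own small scoring module on top of a shared feature extractor, so a new class is
# learned by adding and training one new module while existing ones stay frozen.
# The specific layer sizes and module design are assumptions for illustration only.
import torch
import torch.nn as nn

class AccretionaryClassifier(nn.Module):
    def __init__(self, input_dim: int = 784, feature_dim: int = 64):
        super().__init__()
        # Shared front-end; in practice this could be any pretrained backbone.
        self.backbone = nn.Sequential(nn.Linear(input_dim, feature_dim), nn.ReLU())
        self.feature_dim = feature_dim
        self.class_modules = nn.ModuleDict()  # one scorer per registered class

    def register_class(self, name: str) -> None:
        # Accrete a new class: add a fresh module; previously learned modules
        # are not modified, so earlier knowledge is preserved.
        self.class_modules[name] = nn.Linear(self.feature_dim, 1)

    def forward(self, x: torch.Tensor) -> dict:
        feats = self.backbone(x)
        # Each module emits an independent evidence score for its own class.
        return {name: head(feats).squeeze(-1) for name, head in self.class_modules.items()}

model = AccretionaryClassifier()
model.register_class("digit_0")
model.register_class("digit_1")
scores = model(torch.randn(8, 784))   # per-class scores for the current class set
model.register_class("digit_2")       # later: grow the class set without a full redesign
```

In this sketch, training a newly added module (with the earlier modules' parameters frozen) mirrors the abstract's claim that the class set can expand without forgetting or retraining the whole system.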
Pages: 660-673
Number of Pages: 14