Model Reduction Through Progressive Latent Space Pruning in Deep Active Inference

Cited by: 0
Authors
Wauthier, Samuel T. [1 ]
De Boom, Cedric [1 ]
Catal, Ozan [1 ]
Verbelen, Tim [1 ]
Dhoedt, Bart [1 ]
Affiliations
[1] Ghent Univ Imec, Dept Informat Technol, IDLab, Ghent, Belgium
Source
FRONTIERS IN NEUROROBOTICS | 2022, Vol. 16
Keywords
active inference; free energy; deep learning; model reduction; generative modeling; sleep; dimensionality
DOI
10.3389/fnbot.2022.795846
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Although still not fully understood, sleep is known to play an important role in learning and in pruning synaptic connections. From the active inference perspective, these processes can be cast as learning the parameters of a generative model and as Bayesian model reduction, respectively. In this article, we show how a similar process can reduce the dimensionality of the latent space of such a generative model, and hence model complexity, in deep active inference during training. While deep active inference uses deep neural networks for state space construction, the dimensionality of the latent space must still be specified beforehand. We investigate two methods that are able to prune the latent space of deep active inference models. The first approach functions similarly to sleep and performs model reduction post hoc. The second approach is a novel method, more akin to reflection, which operates during training and displays "aha" moments when the model is able to reduce the dimensionality of the latent space. We show for two well-known simulated environments that model performance is retained in the first approach and diminishes only slightly in the second. We also show that reconstructions from a real-world example are indistinguishable before and after reduction. We conclude that the most important difference between the two approaches constitutes a trade-off between training time and model performance in terms of accuracy and the ability to generalize, via minimization of model complexity.
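For context, the sleep-like post hoc approach builds on Bayesian model reduction, which scores a reduced model (e.g., a reduced prior p~(z) that switches off a latent dimension) directly against the posterior of the full model; under a variational posterior q(z), the change in model evidence is roughly dF = ln E_q(z)[ p~(z) / p(z) ]. Below is a minimal, hypothetical sketch of this idea for a VAE-style generative model with a standard-normal prior N(0, I): latent dimensions whose posterior stays indistinguishable from the prior carry no information and are candidates for pruning. The function names and the threshold kl_tol are illustrative assumptions, not the authors' implementation.

    # Minimal sketch: post hoc latent pruning, assuming a VAE-style model
    # with prior N(0, I). kl_per_dimension, prune_latents, and kl_tol are
    # hypothetical names, not the paper's code.
    import numpy as np

    def kl_per_dimension(mu, log_var):
        # Average KL( N(mu, sigma^2) || N(0, 1) ) per latent dimension over
        # a batch; mu and log_var have shape (batch, latent_dim).
        kl = 0.5 * (mu**2 + np.exp(log_var) - log_var - 1.0)
        return kl.mean(axis=0)

    def prune_latents(mu, log_var, kl_tol=1e-2):
        # Dimensions whose posterior collapses onto the prior (negligible
        # KL) carry no information and are flagged for removal.
        kl = kl_per_dimension(mu, log_var)
        return np.where(kl > kl_tol)[0], np.where(kl <= kl_tol)[0]

    # Toy usage: dimensions 3 and 7 are uninformative, so their posterior
    # matches the prior exactly and they end up in the "drop" set.
    rng = np.random.default_rng(0)
    mu = rng.normal(size=(256, 8))
    log_var = rng.normal(scale=0.1, size=(256, 8))
    mu[:, [3, 7]] = 0.0
    log_var[:, [3, 7]] = 0.0
    keep, drop = prune_latents(mu, log_var)
    print("keep:", keep, "drop:", drop)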
Pages: 16