Learning Generative Models for Active Inference Using Tensor Networks

Cited by: 2
Authors
Wauthier, Samuel T. [1 ]
Vanhecke, Bram [2 ,3 ]
Verbelen, Tim [1 ]
Dhoedt, Bart [1 ]
Affiliations
[1] Ghent University - imec, IDLab, Department of Information Technology, Technologiepark-Zwijnaarde 126, B-9052 Ghent, Belgium
[2] University of Vienna, Faculty of Physics, Boltzmanngasse 5, A-1090 Vienna, Austria
[3] University of Vienna, Faculty of Mathematics, Quantum Optics, Quantum Nanophysics & Quantum Information, Boltzmanngasse 5, A-1090 Vienna, Austria
Source
ACTIVE INFERENCE, IWAI 2022 | 2023, Vol. 1721
Keywords
Active inference; Tensor networks; Generative modeling
DOI
10.1007/978-3-031-28719-0_20
CLC Number
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Active inference provides a general framework for behavior and learning in autonomous agents. It states that an agent will attempt to minimize its variational free energy, defined in terms of beliefs over observations, internal states, and policies. Traditionally, every aspect of a discrete active inference model must be specified by hand, i.e., by manually defining the hidden state space structure as well as the required distributions, such as the likelihood and transition probabilities. Recently, efforts have been made to learn state space representations automatically from observations using deep neural networks. In this paper, we present a novel approach to learning state spaces using quantum physics-inspired tensor networks. The ability of tensor networks to represent the probabilistic nature of quantum states, as well as to reduce large state spaces, makes them a natural candidate for active inference. We show how tensor networks can be used as a generative model for sequential data. Furthermore, we show how one can obtain beliefs from such a generative model and how an active inference agent can use these to compute the expected free energy. Finally, we demonstrate our method on the classic T-maze environment.
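Since only the abstract is reproduced here, the following is a minimal, hypothetical sketch of the kind of model it describes: a matrix product state (MPS) used as a Born-machine-style generative model over short sequences of categorical observations, with probabilities given by squared contraction amplitudes. All dimensions, function names, and the NumPy implementation are illustrative assumptions, not the authors' code.

```python
# Minimal sketch (assumption, not the paper's implementation): an MPS
# "Born machine" assigning probabilities to observation sequences via
# squared contraction amplitudes.
import numpy as np

T, d, D = 4, 2, 8   # sequence length, outcomes per observation, bond dimension
rng = np.random.default_rng(0)
cores = [rng.normal(size=(1 if t == 0 else D, d, D if t < T - 1 else 1))
         for t in range(T)]

def amplitude(seq):
    """Contract the MPS along one observation sequence o_1..o_T."""
    m = cores[0][:, seq[0], :]                  # shape (1, D)
    for t in range(1, T):
        m = m @ cores[t][:, seq[t], :]
    return m.item()

def normalization():
    """Sum of squared amplitudes over all d**T sequences (Born-rule normalizer)."""
    e = None
    for core in cores:
        # Contract each core with a copy of itself over the physical index.
        m = np.einsum('aib,cid->acbd', core, core)
        m = m.reshape(m.shape[0] * m.shape[1], -1)
        e = m if e is None else e @ m
    return e.item()

def prob(seq):
    """Joint probability p(o_1..o_T) under the Born rule."""
    return amplitude(seq) ** 2 / normalization()

def belief_last(prefix):
    """Predictive belief over the final observation given the earlier ones."""
    joint = np.array([prob(list(prefix) + [o]) for o in range(d)])
    return joint / joint.sum()
```

In an active inference setting, predictive beliefs of this kind over future observations, combined with prior preferences, would enter the expected free energy that the agent minimizes when selecting policies; the exact formulation used for the T-maze demonstration is given only in the full paper.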
Pages: 285-297
Number of pages: 13