Variational Hierarchical Mixtures for Probabilistic Learning of Inverse Dynamics

Cited by: 0
Authors
Abdulsamad, Hany [1]
Nickl, Peter [2]
Klink, Pascal [3]
Peters, Jan [3]
Affiliations
[1] Aalto Univ, Dept Elect Engn & Automat, Espoo 02150, Finland
[2] RIKEN Ctr Adv Intelligence Project, Chuo City 1030027, Japan
[3] Tech Univ Darmstadt, Dept Comp Sci, D-64289 Darmstadt, Germany
Funding
EU Horizon 2020
Keywords
Bayes methods; Data models; Computational modeling; Uncertainty; Mixture models; Manipulator dynamics; Neural networks; Dirichlet process mixtures; generative models; hierarchical local regression; inverse dynamics control; SAMPLING METHODS; INFERENCE; NETWORKS; MODELS;
DOI
10.1109/TPAMI.2023.3314670
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Well-calibrated probabilistic regression models are a crucial learning component in robotics applications as datasets grow rapidly and tasks become more complex. Unfortunately, classical regression models are usually either probabilistic kernel machines with a flexible structure that does not scale gracefully with data or deterministic and vastly scalable automata, albeit with a restrictive parametric form and poor regularization. In this paper, we consider a probabilistic hierarchical modeling paradigm that combines the benefits of both worlds to deliver computationally efficient representations with inherent complexity regularization. The presented approaches are probabilistic interpretations of local regression techniques that approximate nonlinear functions through a set of local linear or polynomial units. Importantly, we rely on principles from Bayesian nonparametrics to formulate flexible models that adapt their complexity to the data and can potentially encompass an infinite number of components. We derive two efficient variational inference techniques to learn these representations and highlight the advantages of hierarchical infinite local regression models, such as dealing with non-smooth functions, mitigating catastrophic forgetting, and enabling parameter sharing and fast predictions. Finally, we validate this approach on large inverse dynamics datasets and test the learned models in real-world control scenarios.
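The abstract's core idea is that a nonlinear function can be approximated by a gated combination of local linear units. A minimal sketch of that idea follows, assuming hand-picked Gaussian gates and linear experts purely for illustration; the paper's actual models are learned with Bayesian nonparametric priors and variational inference, which this toy does not attempt.

```python
import numpy as np

# Toy local linear regression: approximate a nonlinear function as
# y(x) = sum_k w_k(x) * (a_k * x + b_k), where the gates w_k(x) are
# normalized Gaussian responsibilities. Parameters are hand-picked
# for illustration, not learned as in the paper.

centers = np.array([-2.0, 0.0, 2.0])   # gate locations
scale = 1.0                            # shared gate bandwidth
slopes = np.array([-1.0, 0.0, 1.0])    # per-unit slope a_k
offsets = np.array([-2.0, 0.0, -2.0])  # per-unit offset b_k

def predict(x):
    """Responsibility-weighted combination of local linear units."""
    x = np.atleast_1d(np.asarray(x, dtype=float))[:, None]  # shape (N, 1)
    logits = -0.5 * ((x - centers) / scale) ** 2            # shape (N, K)
    # Numerically stable softmax over units.
    gates = np.exp(logits - logits.max(axis=1, keepdims=True))
    gates /= gates.sum(axis=1, keepdims=True)
    units = slopes * x + offsets                            # local linear predictions
    return (gates * units).sum(axis=1)

if __name__ == "__main__":
    xs = np.linspace(-4.0, 4.0, 9)
    print(predict(xs))
```

Because the gates partition the input space softly, each unit only needs to be accurate near its center, which is what makes local models attractive for non-smooth targets and for cheap, parallelizable predictions.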
Pages: 1950-1963 (14 pages)