Hierarchical Deep Gaussian Processes Latent Variable Model via Expectation Propagation

Cited: 0
Authors
Taubert, Nick [1]
Giese, Martin A. [1]
Affiliations
[1] Univ Clin Tubingen, Sect Computat Sensomotor, CIN HIH, Otfried Muller Str 25, D-72076 Tubingen, Germany
Source
ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT III | 2021, Vol. 12893
Funding
European Research Council;
Keywords
Deep GP-LVM; Hierarchical probabilistic model; Dimension reduction; Motion synthesis; Expectation propagation;
DOI
10.1007/978-3-030-86365-4_26
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Gaussian Processes (GPs) and related unsupervised learning techniques, such as Gaussian Process Latent Variable Models (GP-LVMs), have been very successful in accurately modeling high-dimensional data from limited amounts of training data. However, these techniques usually suffer from high computational complexity. This makes it difficult to solve the associated learning problems for complex hierarchical models and large data sets, since the required computations, unlike those of neural networks, are not node-local. Combining sparse approximation techniques for GPs with Power Expectation Propagation, we present a framework for the computationally efficient implementation of hierarchical deep Gaussian process (latent variable) models. We provide implementations of this approach on the GPU as well as on the CPU, and we benchmark their efficiency with different optimization algorithms. We present the first implementation of such deep hierarchical GP-LVMs and demonstrate the computational efficiency of our GPU implementation.
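For orientation (this is not part of the record itself), the sparse-GP building block the abstract refers to can be illustrated with the Power Expectation Propagation approximate log marginal likelihood for single-layer sparse GP regression, following Bui et al. (2017); the power parameter alpha interpolates between FITC (alpha = 1) and the variational free-energy approximation (alpha -> 0). The NumPy sketch below is a minimal, self-contained illustration under assumed RBF-kernel hyperparameters, with hypothetical function names (`rbf_kernel`, `pep_log_marginal`); it is not the authors' hierarchical GPU implementation.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between two input sets."""
    sq_dists = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def pep_log_marginal(y, X, Z, alpha=0.5, noise_var=0.1, lengthscale=1.0, variance=1.0):
    """Power-EP approximate log marginal likelihood for sparse GP regression
    (Bui et al., 2017): alpha=1 recovers FITC, alpha->0 recovers the VFE bound."""
    N = X.shape[0]
    Kff_diag = variance * np.ones(N)                  # diag of K_ff for an RBF kernel
    Kuu = rbf_kernel(Z, Z, lengthscale, variance) + 1e-6 * np.eye(Z.shape[0])
    Kuf = rbf_kernel(Z, X, lengthscale, variance)
    Qff = Kuf.T @ np.linalg.solve(Kuu, Kuf)           # Nystroem term K_fu K_uu^{-1} K_uf
    diff = Kff_diag - np.diag(Qff)                    # pointwise approximation error
    cov = Qff + np.diag(alpha * diff) + noise_var * np.eye(N)
    _, logdet = np.linalg.slogdet(cov)
    quad = y @ np.linalg.solve(cov, y)
    log_gauss = -0.5 * (N * np.log(2 * np.pi) + logdet + quad)
    correction = -(1 - alpha) / (2 * alpha) * np.sum(np.log1p(alpha * diff / noise_var))
    return log_gauss + correction

# Toy usage: 50 noisy observations of a sine function, 8 inducing points.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
Z = np.linspace(-3, 3, 8)[:, None]
print(pep_log_marginal(y, X, Z, alpha=0.5))
```

In the hierarchical setting described in the abstract, such a sparse objective would be applied per layer of the deep GP-LVM, with the latent inputs of each layer optimized jointly; the single-layer sketch above only shows the role of the alpha parameter.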
Pages: 317-329
Number of pages: 13