Image Modeling with Deep Convolutional Gaussian Mixture Models

Times Cited: 0
Authors
Gepperth, Alexander [1 ]
Pfuelb, Benedikt [1 ]
Affiliations
[1] Fulda Univ Appl Sci, Fulda, Germany
Source
2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN) | 2021
Keywords
Deep Learning; Gaussian Mixture Models; Deep Convolutional Gaussian Mixture Models; Stochastic Gradient Descent;
DOI
10.1109/IJCNN52387.2021.9533745
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this conceptual work, we present Deep Convolutional Gaussian Mixture Models (DCGMMs): a new formulation of deep hierarchical Gaussian Mixture Models (GMMs) that is particularly suitable for describing and generating images. Vanilla (i.e., flat) GMMs require a very large number of components to describe images well, leading to long training times and memory issues. DCGMMs avoid this with a stacked architecture of multiple GMM layers, linked by convolution and pooling operations. This allows DCGMMs to exploit the compositionality of images in much the same way as deep CNNs do. DCGMMs can be trained end-to-end by Stochastic Gradient Descent. This sets them apart from vanilla GMMs, which are trained by Expectation-Maximization and require a prior k-means initialization that is infeasible in a layered structure. For generating sharp images with DCGMMs, we introduce a new gradient-based technique for sampling through non-invertible operations like convolution and pooling. Based on the MNIST and FashionMNIST datasets, we validate the DCGMM model by demonstrating its superiority over flat GMMs for clustering, sampling and outlier detection.
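The abstract's core training idea, fitting GMM parameters by gradient descent on the log-likelihood rather than by Expectation-Maximization, can be sketched in a minimal flat (single-layer) form. This is not the authors' code; it is an illustrative NumPy sketch for a diagonal-covariance GMM, with the means updated by full-batch gradient ascent (all function names are assumptions):

```python
import numpy as np

def log_likelihood(X, pi, mu, log_var):
    """Mean log-likelihood of X under a diagonal-covariance GMM.
    X: (N, D) data, pi: (K,) weights, mu: (K, D) means, log_var: (K, D)."""
    diff = X[:, None, :] - mu[None, :, :]                 # (N, K, D)
    var = np.exp(log_var)
    # per-component log-density of each sample, shape (N, K)
    log_comp = -0.5 * np.sum(diff**2 / var + log_var + np.log(2 * np.pi), axis=2)
    log_comp = log_comp + np.log(pi)[None, :]             # add mixture weights
    m = log_comp.max(axis=1, keepdims=True)               # log-sum-exp for stability
    return np.mean(m.squeeze(1) + np.log(np.sum(np.exp(log_comp - m), axis=1)))

def gradient_step(X, pi, mu, log_var, lr=0.05):
    """One gradient-ascent step on the means (the likelihood gradient
    w.r.t. mu_k is the responsibility-weighted sum of (x_n - mu_k)/var_k)."""
    diff = X[:, None, :] - mu[None, :, :]
    var = np.exp(log_var)
    log_comp = (-0.5 * np.sum(diff**2 / var + log_var + np.log(2 * np.pi), axis=2)
                + np.log(pi)[None, :])
    r = np.exp(log_comp - log_comp.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)                     # responsibilities (N, K)
    grad_mu = np.einsum('nk,nkd->kd', r, diff / var) / len(X)
    return mu + lr * grad_mu                              # ascend the likelihood
```

In the paper's DCGMM setting, several such GMM layers are stacked with convolution and pooling in between and trained end-to-end; the sketch above only shows why no EM or k-means initialization is needed: the likelihood itself provides a usable gradient.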
Pages: 9