Sparse Multivariate Gaussian Mixture Regression

Cited by: 7
Authors
Weruaga, Luis [1 ]
Via, Javier [2 ]
Affiliations
[1] Khalifa Univ Sci Technol & Res, Sharjah 127788, U Arab Emirates
[2] Univ Cantabria, Dept Commun Engn, E-39005 Santander, Spain
Keywords
Function approximation; Gaussian function mixture (GFM); logarithmic utility function; regression; sparsity; BASIS NEURAL-NETWORKS; ALGORITHM;
DOI
10.1109/TNNLS.2014.2334596
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Fitting a multivariate Gaussian mixture to data is an attractive as well as challenging problem, especially when sparsity in the solution is demanded. Achieving this objective requires the concurrent update of all parameters (weights, centers, and precisions) of all multivariate Gaussian functions during the learning process. Such is the focus of this paper, which presents a novel method founded on the minimization of the error of the generalized logarithmic utility function (GLUF). This choice, which allows us to move smoothly from the mean square error (MSE) criterion to one based on the logarithmic error, yields an optimization problem that resembles a locally convex problem and can be solved with a quasi-Newton method. The GLUF framework also facilitates a comparative study between the two extremes, leading to the conclusion that classical MSE optimization is not the most adequate for this task. The performance of the proposed technique is demonstrated on simulated as well as realistic scenarios.
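To make the setting concrete, the following is a minimal illustrative sketch of Gaussian function mixture (GFM) regression in one dimension: all weights, centers, and precisions are updated concurrently by a quasi-Newton solver, as the abstract describes. It uses the plain MSE criterion, not the paper's GLUF objective, and the component count, log-precision reparameterization, and target function are assumptions for the example, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic target: a true two-component Gaussian mixture plus noise.
rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 200)
y = (1.0 * np.exp(-2.0 * (x - 1.0) ** 2)
     - 0.5 * np.exp(-1.0 * (x + 1.0) ** 2)
     + 0.01 * rng.standard_normal(x.size))

K = 3  # number of Gaussian components (assumption for the example)

def gfm(params, x):
    """Gaussian function mixture: sum_k w_k * exp(-p_k * (x - c_k)^2).

    Precisions are parameterized as exp(p) to keep them positive
    during unconstrained optimization (a common trick, not the paper's).
    """
    w, c, p = params[:K], params[K:2 * K], params[2 * K:]
    g = np.exp(-np.exp(p)[:, None] * (x[None, :] - c[:, None]) ** 2)
    return np.sum(w[:, None] * g, axis=0)

def mse(params):
    return np.mean((gfm(params, x) - y) ** 2)

# Concurrent update of all parameters with a quasi-Newton method.
init = np.concatenate([np.ones(K), np.linspace(-2.0, 2.0, K), np.zeros(K)])
res = minimize(mse, init, method="L-BFGS-B")
print(f"final MSE: {res.fun:.2e}")
```

The fit recovers the underlying mixture to within the noise floor; the paper's GLUF criterion would replace the `mse` objective above, and its sparsity mechanism would additionally drive redundant weights toward zero.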
Pages: 1098 - 1108 (11 pages)