Gaussian Kernel Width Optimization for Sparse Bayesian Learning

Cited by: 28
Authors
Mohsenzadeh, Yalda [1 ]
Sheikhzadeh, Hamid [1 ]
Affiliations
[1] Amirkabir Univ Technol, Dept Elect Engn, Tehran 1415854546, Iran
Keywords
Adaptive kernel learning (AKL); expectation maximization (EM); kernel width optimization; regression; relevance vector machine (RVM); sparse Bayesian learning; supervised kernel methods; MACHINE; POSE;
DOI
10.1109/TNNLS.2014.2321134
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Sparse kernel methods have been widely used in regression and classification applications. The performance and sparsity of these methods depend on an appropriate choice of the kernel functions and their parameters. Typically, the kernel parameters are selected using a cross-validation approach. In this paper, a learning method that extends the relevance vector machine (RVM) is presented. The proposed method finds the optimal values of the kernel parameters during the training procedure. The algorithm uses an expectation-maximization (EM) approach to update the kernel parameters as well as the other model parameters; therefore, the convergence speed and computational complexity of the proposed method are the same as those of the standard RVM. To control the convergence of this fully parameterized model, the optimization with respect to the kernel parameters is performed under a constraint on these parameters. The proposed method is compared with the standard RVM and other competing methods to evaluate its performance. Experimental results on commonly used synthetic data, as well as on benchmark data sets, demonstrate the effectiveness of the proposed method in reducing the dependence of performance on the initial choice of the kernel parameters.
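To make the setting concrete, below is a minimal sketch of standard RVM regression with a Gaussian kernel, where the kernel width is chosen by a simple marginal-likelihood grid search. This is a stand-in for the paper's contribution (the paper folds the width update into the EM iterations instead of searching externally); the function names, data, and the grid of widths are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rvm_regression(X, t, width, n_iter=100):
    """Standard RVM regression with a Gaussian kernel of fixed width.

    Returns the posterior mean weights, the weight precisions (alpha),
    the noise precision (beta), and the log marginal likelihood, so an
    outer loop can compare candidate kernel widths.
    """
    N = X.shape[0]
    # Design matrix: Gaussian (RBF) kernel centered on every training point
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    Phi = np.exp(-d2 / (2.0 * width ** 2))
    alpha = np.ones(N)   # prior precisions of the weights
    beta = 1.0           # noise precision
    for _ in range(n_iter):
        A = np.diag(alpha)
        Sigma = np.linalg.inv(beta * Phi.T @ Phi + A)
        mu = beta * Sigma @ Phi.T @ t
        # Type-II maximum-likelihood updates for alpha and beta
        gamma = 1.0 - alpha * np.diag(Sigma)
        alpha = gamma / (mu ** 2 + 1e-12)
        resid = t - Phi @ mu
        beta = (N - gamma.sum()) / (resid @ resid + 1e-12)
    # Log marginal likelihood of the targets given this kernel width
    C = np.eye(N) / beta + Phi @ np.diag(1.0 / alpha) @ Phi.T
    _, logdet = np.linalg.slogdet(C)
    logml = -0.5 * (N * np.log(2 * np.pi) + logdet + t @ np.linalg.solve(C, t))
    return mu, alpha, beta, logml

# Simplified width selection over a coarse grid (the paper's EM-based
# method avoids this external search entirely).
rng = np.random.default_rng(0)
X = np.linspace(-5, 5, 60)[:, None]
t = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(60)
widths = [0.3, 0.5, 1.0, 2.0]
best = max(widths, key=lambda w: rvm_regression(X, t, w)[3])
```

The grid search makes the dependence on the kernel width explicit: each width yields a different marginal likelihood, which is exactly the sensitivity the paper's in-loop EM update of the width is designed to remove.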
Pages: 709-719
Page count: 11
Related Papers
50 records in total
  • [1] Adaptive spherical Gaussian kernel in sparse Bayesian learning framework for nonlinear regression
    Yuan, Jin
    Bo, Liefeng
    Wang, Kesheng
    Yu, Tao
    EXPERT SYSTEMS WITH APPLICATIONS, 2009, 36 (02) : 3982 - 3989
  • [2] Sparse Bayesian Modeling With Adaptive Kernel Learning
    Tzikas, Dimitris G.
    Likas, Aristidis C.
    Galatsanos, Nikolaos P.
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2009, 20 (06): : 926 - 937
  • [3] Sparse Bayesian Learning for non-Gaussian sources
    Porter, Richard
    Tadic, Vladislav
    Achim, Achim
    DIGITAL SIGNAL PROCESSING, 2015, 45 : 2 - 12
  • [4] A new surrogate modeling method combining polynomial chaos expansion and Gaussian kernel in a sparse Bayesian learning framework
    Zhou, Yicheng
    Lu, Zhenzhou
    Cheng, Kai
    INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, 2019, 120 (04) : 498 - 516
  • [5] On the Support Recovery of Jointly Sparse Gaussian Sources via Sparse Bayesian Learning
    Khanna, Saurabh
    Murthy, Chandra R.
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2022, 68 (11) : 7361 - 7378
  • [6] Sparse Bayesian Learning Based on Collaborative Neurodynamic Optimization
    Zhou, Wei
    Zhang, Hai-Tao
    Wang, Jun
    IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52 (12) : 13669 - 13683
  • [7] An Efficient Sparse Bayesian Learning Algorithm Based on Gaussian-Scale Mixtures
    Zhou, Wei
    Zhang, Hai-Tao
    Wang, Jun
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (07) : 3065 - 3078
  • [8] Vector Approximate Message Passing with Sparse Bayesian Learning for Gaussian Mixture Prior
    Ruan, Chengyao
    Zhang, Zaichen
    Jiang, Hao
    Dang, Jian
    Wu, Liang
    Zhang, Hongming
    CHINA COMMUNICATIONS, 2023, 20 (05) : 57 - 69
  • [9] A Mixed Mahalanobis Kernel for Sparse Bayesian Classification
    Tong, Mi
    Qin, Wangchen
    Liu, Fang
    Qi, Quan
    2018 5TH INTERNATIONAL CONFERENCE ON INFORMATION SCIENCE AND CONTROL ENGINEERING (ICISCE 2018), 2018, : 61 - 65
  • [10] Fast Kernel Distribution Function Estimation and fast kernel density estimation based on sparse Bayesian learning and regularization
    Yin, Xun-Fu
    Hao, Zhi-Feng
    PROCEEDINGS OF 2008 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-7, 2008, : 1756 - +