An evolutionary sampling approach with adaptive Levy probability distribution

Cited: 0
Authors
Hu, Chaoru [1 ]
Sun, Jun [1 ]
Na, Tian [2 ]
Affiliations
[1] Jiangnan Univ, Sch IoT Engn, Wuxi 214122, Jiangsu, Peoples R China
[2] Jiangnan Univ, Sch Humanities, Wuxi 214122, Jiangsu, Peoples R China
Source
2016 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC) | 2016
Keywords
SEQUENTIAL MONTE-CARLO; ENCLOSING BALL; DENSITY; MACHINE;
DOI
None available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In classical learning methods, sampling is the process of acquiring training data: it selects representative samples from the original data and offers a solution to some otherwise intractable optimization problems. In this paper, we investigate how to draw accurate samples from an arbitrary distribution. Building on our previous work and the principles of evolutionary computation, we propose an improved evolutionary sampling approach (ES). To improve the performance of ES, we propose a general method for setting the smoothing parameter, employing an adaptive Levy probability distribution to control this parameter along the Markov chain. In the evolutionary sampling process, a support sample model (SSM) is first built to approximate the original density function; the proposed approach then converges to the optimal solution by minimizing the total distance, i.e., by seeking accurate model parameters for the SSM. The proposed algorithm is applied to probability density functions with different numbers of peaks to obtain optimal sample sets that represent the solutions of the related problems. The experimental results show that the proposed variant of the evolutionary sampling approach obtains accurate sample sets with high efficiency, and that the method is suitable for any other machine learning problem that can be transformed into a density function estimation problem within a probabilistic framework.
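The abstract does not specify how the Levy-distributed steps enter the evolutionary loop; a common way to draw such heavy-tailed steps is Mantegna's algorithm. The following minimal sketch is purely illustrative and is not the authors' ES/SSM method: it mutates candidates with Levy-distributed perturbations and keeps improvements, on a hypothetical one-dimensional objective (the function names `levy_step` and `evolve` are our own).

```python
import math
import random

def levy_step(alpha=1.5):
    """Draw one Levy-distributed step via Mantegna's algorithm."""
    sigma_u = (math.gamma(1 + alpha) * math.sin(math.pi * alpha / 2)
               / (math.gamma((1 + alpha) / 2) * alpha
                  * 2 ** ((alpha - 1) / 2))) ** (1 / alpha)
    u = random.gauss(0, sigma_u)   # numerator: Gaussian with variance sigma_u^2
    v = random.gauss(0, 1)         # denominator: standard Gaussian
    return u / abs(v) ** (1 / alpha)

def evolve(population, fitness, alpha=1.5, scale=0.01):
    """One generation: perturb each candidate with a Levy step, keep the better."""
    new_pop = []
    for x in population:
        y = x + scale * levy_step(alpha)
        new_pop.append(y if fitness(y) < fitness(x) else x)
    return new_pop

# Toy usage: minimize (x - 2)^2 with a small population.
random.seed(0)
f = lambda x: (x - 2.0) ** 2
pop = [random.uniform(-5, 5) for _ in range(20)]
for _ in range(200):
    pop = evolve(pop, f)
best = min(pop, key=f)
```

The heavy tail of the Levy distribution occasionally produces large jumps, which is what motivates its use for escaping local optima; smaller values of `alpha` make such jumps more frequent.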
Pages: 1991-1997
Page count: 7
References (22 total)
  • [1] General support vector representation machine for one-class classification of non-stationary classes
    Camci, Fatih
    Chinnam, Ratna Babu
    [J]. PATTERN RECOGNITION, 2008, 41 (10) : 3021 - 3034
  • [2] An overview of existing methods and recent advances in sequential Monte Carlo
    Cappe, Olivier
    Godsill, Simon J.
    Moulines, Eric
    [J]. PROCEEDINGS OF THE IEEE, 2007, 95 (05) : 899 - 924
  • [3] From Minimum Enclosing Ball to Fast Fuzzy Inference System Training on Large Datasets
    Chung, Fu-Lai
    Deng, Zhaohong
    Wang, Shitong
    [J]. IEEE TRANSACTIONS ON FUZZY SYSTEMS, 2009, 17 (01) : 173 - 184
  • [4] DEHEUVELS P, 1977, REV STATISTIQUE APPL, V25, P5
  • [5] Sequential Monte Carlo samplers
    Del Moral, Pierre
    Doucet, Arnaud
    Jasra, Ajay
    [J]. JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY, 2006, 68 : 411 - 436
  • [6] FRSDE: Fast reduced set density estimator using minimal enclosing ball approximation
    Deng, Zhaohong
    Chung, Fu-Lai
    Wang, Shitong
    [J]. PATTERN RECOGNITION, 2008, 41 (04) : 1363 - 1372
  • [7] Devroye L., 1986, NonUniform Random Variate Generation
  • [8] Efficient block sampling strategies for sequential Monte Carlo methods
    Doucet, Arnaud
    Briers, Mark
    Stephane, Senecal
    [J]. JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2006, 15 (03) : 693 - 711
  • [9] Gnedenko BV., 1954, AM J MATH, V105, P28
  • [10] A brief survey of bandwidth selection for density estimation
    Jones, MC
    Marron, JS
    Sheather, SJ
    [J]. JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 1996, 91 (433) : 401 - 407