Globally convergent stochastic optimization with optimal asymptotic distribution

Cited: 6
Author
Dippon, J [1 ]
Institution
[1] Univ Stuttgart, Inst Math A, D-70511 Stuttgart, Germany
Keywords
stochastic approximation; global stochastic optimization; averaging; gradient descent; consistency; central limit theorem; M-estimator; maximum likelihood estimation; regression analysis; artificial neural networks;
DOI
10.1017/S0021900200015023
Chinese Library Classification
O21 [Probability theory and mathematical statistics]; C8 [Statistics];
Subject classification codes
020208 ; 070103 ; 0714 ;
Abstract
A stochastic gradient descent method is combined with a consistent auxiliary estimate to achieve global convergence of the recursion. Using step lengths that converge to zero more slowly than 1/n and averaging the trajectories yields the optimal convergence rate of 1/√n and the optimal variance of the asymptotic distribution. Possible applications can be found in maximum likelihood estimation, regression analysis, training of artificial neural networks, and stochastic optimization.
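The averaging scheme described in the abstract (Polyak–Ruppert averaging) can be sketched as follows. This is a minimal illustration of the general idea, not the paper's exact recursion: the auxiliary estimate that guarantees global convergence is omitted, and the function `averaged_sgd`, the step-length constants, and the quadratic test objective are all illustrative assumptions.

```python
import numpy as np

def averaged_sgd(grad, x0, n_steps, a=1.0, gamma=0.75, rng=None):
    """Stochastic gradient descent with trajectory averaging.

    Step lengths a / k**gamma with gamma in (1/2, 1) decay more slowly
    than 1/k; averaging the iterates is what recovers the optimal
    1/sqrt(n) rate mentioned in the abstract (illustrative sketch only).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    avg = np.zeros_like(x)
    for k in range(1, n_steps + 1):
        x = x - (a / k**gamma) * grad(x, rng)   # noisy gradient step
        avg += (x - avg) / k                    # running mean of iterates
    return avg

# Hypothetical example: noisy gradient of f(x) = 0.5 * ||x - 1||^2,
# whose minimizer is the all-ones vector.
noisy_grad = lambda x, rng: (x - 1.0) + 0.1 * rng.standard_normal(x.shape)
x_bar = averaged_sgd(noisy_grad, x0=np.zeros(2), n_steps=20000,
                     rng=np.random.default_rng(0))
```

The averaged iterate `x_bar` lands close to the minimizer even though the individual iterates keep fluctuating at the scale of the (slowly decaying) step length.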
Pages: 395-406
Page count: 12