Noise conditions for prespecified convergence rates of stochastic approximation algorithms

Cited by: 8
Authors
Chong, EKP [1]
Wang, IJ [2]
Kulkarni, SR [3]
Affiliations
[1] Purdue Univ, Sch Elect & Comp Engn, W Lafayette, IN 47907 USA
[2] Johns Hopkins Univ, Appl Phys Lab, Res & Technol Dev Ctr, Laurel, MD 20723 USA
[3] Princeton Univ, Dept Elect Engn, Princeton, NJ 08544 USA
Funding
National Science Foundation (USA);
Keywords
convergence rate; Kiefer-Wolfowitz; necessary and sufficient conditions; noise sequences; Robbins-Monro; stochastic approximation;
DOI
10.1109/18.749035
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
We develop deterministic necessary and sufficient conditions on individual noise sequences of a stochastic approximation algorithm for the error of the iterates to converge at a given rate. Specifically, suppose $\{\rho_n\}$ is a given positive sequence converging monotonically to zero. Consider a stochastic approximation algorithm $x_{n+1} = x_n - a_n(A_n x_n - b_n) + a_n e_n$, where $\{x_n\}$ is the iterate sequence, $\{a_n\}$ is the step-size sequence, $\{e_n\}$ is the noise sequence, and $x^*$ is the desired zero of the function $f(x) = Ax - b$. Then, under appropriate assumptions, we show that $x_n - x^* = o(\rho_n)$ if and only if the sequence $\{e_n\}$ satisfies one of five equivalent conditions. These conditions are based on well-known formulas for noise sequences: Kushner and Clark's condition, Chen's condition, Kulkarni and Horn's condition, a decomposition condition, and a weighted averaging condition. Our necessary and sufficient condition on $\{e_n\}$ for achieving a convergence rate of $\{\rho_n\}$ is essentially that the sequence $\{e_n/\rho_n\}$ satisfies any one of the five well-known conditions above. We provide examples to illustrate our result. In particular, we easily recover the familiar result that if $a_n = a/n$ and $\{e_n\}$ is a martingale-difference process with bounded variance, then $x_n - x^* = o(n^{-1/2}(\log n)^{\beta})$ for any $\beta > 1/2$.
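As a hedged numerical sketch (not from the paper itself), the Python snippet below simulates a scalar instance of the recursion above with constant $A_n \equiv A$, $b_n \equiv b$, step sizes $a_n = a/n$, and i.i.d. Gaussian noise, which is a martingale-difference sequence with bounded variance. It prints the ratio $|x_n - x^*|/\rho_n$ for $\rho_n = n^{-1/2}(\log n)^{\beta}$ with $\beta > 1/2$; all constants, the horizon, and the checkpoints are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (assumed parameters, not the paper's experiments):
# simulate x_{n+1} = x_n - a_n (A_n x_n - b_n) + a_n e_n with A_n = A,
# b_n = b fixed, a_n = a / n, and i.i.d. Gaussian noise e_n (a martingale
# difference sequence with bounded variance). The root is x* = b / A.

rng = np.random.default_rng(0)

A, b = 1.0, 0.0        # f(x) = A x - b, so x* = 0
a = 1.0                # step-size constant in a_n = a / n
N = 10**6              # number of iterations (illustrative horizon)
beta = 0.75            # any beta > 1/2 should give x_n - x* = o(rho_n)

x = 5.0                # arbitrary starting point
x_star = b / A
checkpoints = {10**k for k in range(2, 7)}

for n in range(1, N + 1):
    a_n = a / n
    e_n = rng.standard_normal()              # martingale-difference noise
    x = x - a_n * (A * x - b) + a_n * e_n
    if n in checkpoints:
        rho_n = n**-0.5 * np.log(n)**beta    # target rate rho_n
        print(f"n={n:>8d}  |x_n - x*| / rho_n = {abs(x - x_star) / rho_n:.4f}")
```

If the quoted result applies, the printed ratio should drift toward zero as $n$ grows. The choice $a = A = 1$ gives $aA > 1/2$, the condition classical analyses of $a_n = a/n$ step sizes typically impose to obtain the $n^{-1/2}$-type rate.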
Pages: 810-814
Page count: 5