Principal component analysis (PCA) is one of the most commonly used statistical procedures, with a wide range of applications. Suppose the points $X_1, X_2, \ldots, X_n$ are vectors drawn i.i.d. from a distribution with mean zero and unknown covariance $\Sigma$. Let $A_n = X_n X_n^{\top}$; then $\mathbb{E}[A_n] = \Sigma$. This paper considers the problem of finding the smallest eigenvalue of $\Sigma$ and its corresponding eigenvector. A classical estimator of this type is due to Krasulina (1969). We state a convergence proof of a Krasulina-type scheme for the smallest eigenvalue and its corresponding eigenvector, and then derive its convergence rate.
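The abstract does not reproduce the paper's exact update rule, so the following is only a minimal sketch of a Krasulina-type stochastic iteration adapted to the smallest eigenpair: the classical Krasulina (1969) ascent step for the leading eigenvector is run with the sign flipped, which amounts to stochastic descent on the Rayleigh quotient. The function name krasulina_smallest, the 1/n step size, and the plug-in eigenvalue estimate are illustrative assumptions, not the authors' scheme.

```python
import numpy as np

def krasulina_smallest(samples, step_c=1.0, seed=0):
    """Stochastic Krasulina-type iteration targeting the smallest eigenpair of Sigma.

    samples : (n, d) array of i.i.d. zero-mean observations X_1, ..., X_n.
    Returns an estimate (lambda_hat, w_hat) of the smallest eigenvalue and a
    unit-norm eigenvector.  Names and tuning choices are illustrative assumptions.
    """
    samples = np.asarray(samples, dtype=float)
    n, d = samples.shape
    rng = np.random.default_rng(seed)
    w = rng.normal(size=d)                      # random initial direction
    for t, x in enumerate(samples, start=1):
        gamma = step_c / t                      # classical 1/n-type step size
        Aw = x * (x @ w)                        # A_t w with A_t = X_t X_t^T (rank one, no d x d matrix formed)
        rayleigh = (w @ Aw) / (w @ w)           # Rayleigh quotient w^T A_t w / ||w||^2
        w = w - gamma * (Aw - rayleigh * w)     # sign-flipped Krasulina step: descent -> smallest eigenpair
    w /= np.linalg.norm(w)
    lambda_hat = w @ (samples.T @ (samples @ w)) / n   # plug-in estimate w^T Sigma_hat w
    return lambda_hat, w

# Illustrative check against a known diagonal covariance.
rng = np.random.default_rng(1)
Sigma = np.diag([5.0, 2.0, 0.5])
X = rng.multivariate_normal(np.zeros(3), Sigma, size=20000)
lam_hat, w_hat = krasulina_smallest(X)
print(lam_hat)           # expected to be close to 0.5 (up to Monte Carlo error)
print(np.abs(w_hat))     # expected to align with the third coordinate axis
```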
Authors: Sun, Lin; Liu, Youzhu; Xu, Weijun; Xiao, Weilin
Affiliations:
Guangdong Univ Technol, Fac Appl Math, Guangzhou 510090, Guangdong, Peoples R China
S China Univ Technol, Sch Business Adm, Guangzhou 510641, Guangdong, Peoples R China
Zhejiang Univ, Sch Management, Hangzhou 310006, Zhejiang, Peoples R China