Better scaled local tangent space alignment algorithm

Cited: 4
Authors
Yang, Jian [1 ,2 ]
Li, Fu-Xin [1 ,2 ]
Wang, Jue [1 ]
Institutions
[1] Institute of Automation, Chinese Academy of Sciences
[2] Graduate School, Chinese Academy of Sciences
Source
Ruan Jian Xue Bao/Journal of Software | 2005 / Vol. 16 / No. 09
Keywords
Dimensionality reduction; Local principal component analysis; Local tangent space alignment; Manifold learning; Principal component analysis; X-means;
DOI
10.1360/jos161584
Abstract
Recently, a new manifold learning algorithm, LTSA (local tangent space alignment), has been proposed. It is effective for many nonlinear dimension reduction problems but ill-suited to large data sets and newly arrived data. This paper presents an improved algorithm, PLTSA (partitional local tangent space alignment), built on VQPCA (vector quantization principal component analysis) and LTSA. The sample space is first divided into overlapping blocks with the X-Means algorithm. Each block is then projected onto its local tangent space to obtain local low-dimensional coordinates for its points. Finally, the global low-dimensional embedded manifold is obtained through local affine transformations. PLTSA improves on VQPCA in that it yields global coordinates for the data, and it operates on a much smaller optimization matrix than LTSA, giving a better-scaled algorithm. It also provides a set of transformations with which the global embedded coordinates of newly arrived data can be computed. Experiments demonstrate the validity of the algorithm.
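The first two stages described in the abstract (partitioning the sample space, then projecting each block onto its local tangent space) can be sketched as below. This is a minimal illustration under stated assumptions, not the authors' implementation: plain k-means with a fixed k stands in for X-Means (which also estimates the number of clusters), the blocks here are not overlapping, and the final global-alignment stage (solving for the local affine transformations) is omitted.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means partition, a simplified stand-in for X-Means."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest center
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

def local_tangent_coords(block, d):
    """Project one block onto its d-dimensional local tangent space
    via PCA (SVD of the centered block), as in VQPCA/LTSA's local step."""
    centered = block - block.mean(axis=0)
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ Vt[:d].T  # local low-dimensional coordinates

# Toy example: 200 noisy 2-D points on a sine curve, reduced to d=1 per block.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 3 * np.pi, 200))
X = np.column_stack([t, np.sin(t)]) + 0.01 * rng.normal(size=(200, 2))

labels, _ = kmeans(X, k=4)
coords = {j: local_tangent_coords(X[labels == j], d=1)
          for j in range(4) if np.any(labels == j)}
```

In the full algorithm, the per-block coordinates in `coords` would then be stitched into a single global embedding by estimating one affine transformation per block, using the points shared between overlapping blocks.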
Pages: 1584-1590
Page count: 6
Related Papers
13 records
[1]  
Seung H.S., Lee D.D., The manifold ways of perception, Science, 290, 5500, pp. 2268-2269, (2000)
[2]  
Donoho D.L., Grimes C., Hessian Eigenmaps: New locally linear embedding techniques for high-dimensional data, Proc. of the National Academy of Sciences of the United States of America, 100, 10, pp. 5591-5596, (2003)
[3]  
Tenenbaum J., de Silva V., Langford J., A global geometric framework for nonlinear dimensionality reduction, Science, 290, 5500, pp. 2319-2323, (2000)
[4]  
Roweis S., Saul L., Nonlinear dimensionality reduction by locally linear embedding, Science, 290, 5500, pp. 2323-2326, (2000)
[5]  
Belkin M., Niyogi P., Laplacian Eigenmaps for dimensionality reduction and data representation, Neural Computation, 15, 6, pp. 1373-1396, (2003)
[6]  
Min W.L., Lu L., He X.F., Locality pursuit embedding, Pattern Recognition, 37, 4, pp. 781-788, (2004)
[7]  
Zhang Z.Y., Zha H.Y., Principal manifolds and nonlinear dimensionality reduction via tangent space alignment, SIAM Journal on Scientific Computing, 26, 1, pp. 313-338, (2004)
[8]  
Kambhatla N., Leen T.K., Dimension reduction by local principal component analysis, Neural Computation, 9, 7, pp. 1493-1516, (1997)
[9]  
Pelleg D., Moore A., X-means: Extending K-means with efficient estimation of the number of clusters, Proc. of the 17th Int'l Conf. on Machine Learning, pp. 727-734, (2000)
[10]  
Chen W.H., An Introduction to Differentiable Manifold, (2001)