Manifold Learning with Arbitrary Norms

Cited by: 11
Authors
Kileel, Joe [1 ,2 ]
Moscovich, Amit [3 ]
Zelesko, Nathan [4 ]
Singer, Amit [5 ,6 ]
Affiliations
[1] Univ Texas Austin, Dept Math, Austin, TX 78712 USA
[2] Univ Texas Austin, Oden Inst, Austin, TX 78712 USA
[3] Tel Aviv Univ, Dept Stat & Operat Res, Tel Aviv, Israel
[4] Brown Univ, Dept Math, Providence, RI 02912 USA
[5] Princeton Univ, Dept Math, Princeton, NJ 08544 USA
[6] Princeton Univ, Program Appl & Computat Math, Princeton, NJ 08544 USA
Keywords
Dimensionality reduction; Diffusion maps; Laplacian eigenmaps; Second-order differential operator; Riemannian geometry; Convex body; Nonlinear dimensionality reduction; Cryo-EM; Laplacian; Convergence; Consistency; Graph; Regression; Eigenmaps
DOI
10.1007/s00041-021-09879-2
Chinese Library Classification
O29 [Applied Mathematics]
Discipline Code
070104
Abstract
Manifold learning methods play a prominent role in nonlinear dimensionality reduction and other tasks involving high-dimensional data sets with low intrinsic dimensionality. Many of these methods are graph-based: they associate a vertex with each data point and a weighted edge with each pair. Existing theory shows that the Laplacian matrix of the graph converges to the Laplace-Beltrami operator of the data manifold, under the assumption that the pairwise affinities are based on the Euclidean norm. In this paper, we determine the limiting differential operator for graph Laplacians constructed using any norm. Our proof involves an interplay between the second fundamental form of the manifold and the convex geometry of the given norm's unit ball. To demonstrate the potential benefits of non-Euclidean norms in manifold learning, we consider the task of mapping the motion of large molecules with continuous variability. In a numerical simulation we show that a modified Laplacian eigenmaps algorithm, based on the Earthmover's distance, outperforms the classic Euclidean Laplacian eigenmaps, both in terms of computational cost and the sample size needed to recover the intrinsic geometry.
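The graph-Laplacian construction described in the abstract can be sketched in a few lines: build pairwise affinities under a chosen norm, form the unnormalized Laplacian, and embed with its low-frequency eigenvectors. The sketch below is illustrative only, not the paper's implementation; the bandwidth `eps`, the ℓ1 (`cityblock`) metric standing in for a non-Euclidean norm, and the toy circle data are all assumptions for demonstration.

```python
import numpy as np
from scipy.spatial.distance import cdist

def laplacian_eigenmaps(X, n_components=2, metric="cityblock", eps=1.0):
    """Laplacian eigenmaps with an arbitrary norm (illustrative sketch).

    X            : (n, d) data matrix
    metric       : any scipy metric; 'cityblock' gives the l1 norm
    eps          : Gaussian kernel bandwidth
    n_components : embedding dimension
    """
    # Pairwise distances under the chosen norm.
    D = cdist(X, X, metric=metric)
    # Gaussian affinity weights on the complete graph (zero self-loops).
    W = np.exp(-D ** 2 / eps)
    np.fill_diagonal(W, 0.0)
    # Unnormalized graph Laplacian: L = degree matrix - weight matrix.
    L = np.diag(W.sum(axis=1)) - W
    # Eigenvectors of the smallest eigenvalues; skip the constant one.
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, 1:n_components + 1]

# Toy example: a noisy circle embedded in 10 dimensions.
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 200)
X = np.zeros((200, 10))
X[:, 0], X[:, 1] = np.cos(t), np.sin(t)
X += 0.01 * rng.standard_normal(X.shape)
Y = laplacian_eigenmaps(X, n_components=2, metric="cityblock", eps=0.5)
```

Swapping `metric` changes the norm used for the affinities; the paper's theoretical result characterizes the limiting second-order differential operator that such a Laplacian approximates for any choice of norm.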
Pages: 56