Properly-Weighted Graph Laplacian for Semi-supervised Learning

Cited by: 29
Authors
Calder, Jeff [1 ]
Slepcev, Dejan [2 ]
Affiliations
[1] Univ Minnesota, Dept Math, Minneapolis, MN 55455 USA
[2] Carnegie Mellon Univ, Dept Math Sci, Pittsburgh, PA 15213 USA
Keywords
Semi-supervised learning; Label propagation; Asymptotic consistency; PDEs on graphs; Gamma-convergence; p-Laplacian; Regularization; Classification; Convergence; Ranking
DOI
10.1007/s00245-019-09637-3
Chinese Library Classification
O29 [Applied Mathematics]
Discipline code
070104
Abstract
The performance of traditional graph Laplacian methods for semi-supervised learning degrades substantially as the ratio of labeled to unlabeled data decreases, due to a degeneracy in the graph Laplacian. Several approaches have recently been proposed to address this; however, we show that some of them remain ill-posed in the large-data limit. In this paper, we show how to correctly set the weights in Laplacian regularization so that the estimator remains well-posed and stable in the large-sample limit. We prove that our semi-supervised learning algorithm converges, in the infinite-sample-size limit, to the smooth solution of a continuum variational problem that attains the labeled values continuously. Our method is fast and easy to implement.
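To make the setting concrete, here is a minimal sketch of the classical Laplacian-regularized semi-supervised learning baseline that the paper improves on: labels are propagated by minimizing the Dirichlet energy u^T L u with u pinned to the given values at the labeled vertices. This is only the standard harmonic-extension solve, not the authors' properly weighted variant (which additionally rescales edge weights near labeled points); all names below are illustrative.

```python
import numpy as np

def laplacian_learning(W, labeled_idx, labels):
    """Baseline graph-Laplacian label propagation (hypothetical helper).

    Minimizes u^T L u subject to u = labels on labeled_idx, where
    L = D - W is the unnormalized graph Laplacian. The first-order
    optimality condition on the unlabeled set is the linear system
        L_uu u_u = -L_ul u_l.
    """
    n = W.shape[0]
    L = np.diag(W.sum(axis=1)) - W                      # unnormalized Laplacian
    unlabeled = np.setdiff1d(np.arange(n), labeled_idx)
    u = np.zeros(n)
    u[labeled_idx] = labels
    L_uu = L[np.ix_(unlabeled, unlabeled)]
    L_ul = L[np.ix_(unlabeled, labeled_idx)]
    u[unlabeled] = np.linalg.solve(L_uu, -L_ul @ u[labeled_idx])
    return u

# Tiny example: a path graph on 5 vertices with the endpoints labeled.
W = np.zeros((5, 5))
for i in range(4):
    W[i, i + 1] = W[i + 1, i] = 1.0
u = laplacian_learning(W, np.array([0, 4]), np.array([0.0, 1.0]))
print(np.round(u, 2))  # harmonic interpolation: [0. 0.25 0.5 0.75 1.]
```

With only a few labels among many unlabeled points, the minimizer of this energy degenerates toward a near-constant function with spikes at the labels; the paper's properly weighted Laplacian modifies the edge weights near labeled points precisely to prevent this degeneracy in the large-sample limit.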
Pages: 1111-1159 (49 pages)