ON UNSUPERVISED NEAREST-NEIGHBOR REGRESSION AND ROBUST LOSS FUNCTIONS

Cited by: 2
Author
Kramer, Oliver [1]
Affiliation
[1] Carl von Ossietzky Univ Oldenburg, Dept Comp Sci, D-26129 Oldenburg, Germany
Source
ICAART: PROCEEDINGS OF THE 4TH INTERNATIONAL CONFERENCE ON AGENTS AND ARTIFICIAL INTELLIGENCE, VOL 1 | 2012
Keywords
Non-linear dimensionality reduction; Manifold learning; Unsupervised regression; K-nearest neighbour regression; Component analysis; Dimensionality
DOI
10.5220/0003749301640170
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In many scientific disciplines, structures in high-dimensional data have to be detected, e.g., in stellar spectra, in genome data, or in face recognition tasks. We present an approach to non-linear dimensionality reduction based on fitting nearest neighbor regression into the unsupervised regression framework for learning low-dimensional manifolds. The problem of optimizing latent neighborhoods is difficult to solve, but the unsupervised nearest neighbors (UNN) formulation allows an efficient strategy of iteratively embedding latent points into fixed neighborhood topologies. The choice of an appropriate loss function is relevant, in particular for noisy and high-dimensional data spaces. We extend UNN regression by the epsilon-insensitive loss, which allows residuals below a threshold epsilon to be ignored. In the experimental part of this paper we test the influence of epsilon on the final data space reconstruction error and present a visualization of UNN embeddings on test data sets.
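The abstract's core idea, iteratively embedding points into a fixed latent neighborhood topology so that K-nearest neighbor reconstruction of the data space stays cheap to evaluate, can be sketched compactly. The following Python sketch is illustrative only: it assumes a 1-D latent order as the fixed topology and a componentwise epsilon-insensitive loss; the names (unn_embed, knn_reconstruct) and the greedy left-to-right insertion are assumptions for illustration, not the paper's exact algorithm.

import numpy as np

def eps_insensitive_loss(residuals, eps):
    # Residual components with magnitude below eps are ignored (zero cost).
    return np.maximum(np.abs(residuals) - eps, 0.0).sum()

def knn_reconstruct(order, Y, k):
    # Reconstruct each embedded pattern as the mean of its k nearest
    # neighbors along the 1-D latent order (the pattern itself is excluded).
    n = len(order)
    recon = np.empty((n, Y.shape[1]))
    for pos in range(n):
        dists = np.abs(np.arange(n) - pos)
        dists[pos] = n + 1          # exclude self from the neighborhood
        nbrs = np.argsort(dists)[:k]
        recon[pos] = Y[[order[j] for j in nbrs]].mean(axis=0)
    return recon

def unn_embed(Y, k=2, eps=0.1):
    # Greedily insert each pattern at the latent position that minimizes
    # the epsilon-insensitive data space reconstruction error.
    order = [0]
    for i in range(1, len(Y)):
        best_pos, best_err = 0, np.inf
        for pos in range(len(order) + 1):
            cand = order[:pos] + [i] + order[pos:]
            recon = knn_reconstruct(cand, Y, min(k, len(cand) - 1))
            err = eps_insensitive_loss(Y[cand] - recon, eps)
            if err < best_err:
                best_pos, best_err = pos, err
        order.insert(best_pos, i)
    return order

For example, calling unn_embed(Y, k=2, eps=0.05) on noisy samples of a 3-D curve yields a 1-D latent ordering; sweeping eps and recording the final reconstruction error mirrors the experiment the abstract describes.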
Pages: 164-170
Number of pages: 7