Data segmentation based on the local intrinsic dimension

Cited: 0
Authors
Michele Allegra
Elena Facco
Francesco Denti
Alessandro Laio
Antonietta Mira
Affiliations
[1] Aix Marseille Université, Institut de Neurosciences de la Timone UMR 7289
[2] CNRS
[3] Scuola Internazionale Superiore di Studi Avanzati
[4] University of California
[5] International Centre for Theoretical Physics
[6] Università della Svizzera italiana
[7] Università dell’Insubria
Source
Scientific Reports, Volume 10
Abstract
One of the founding paradigms of machine learning is that a small number of variables is often sufficient to describe high-dimensional data. The minimum number of variables required is called the intrinsic dimension (ID) of the data. Contrary to common intuition, there are cases where the ID varies within the same data set. This fact has been highlighted in technical discussions, but seldom exploited to analyze large data sets and obtain insight into their structure. Here we develop a robust approach to discriminate regions with different local IDs and segment the points accordingly. Our approach is computationally efficient and can be proficiently used even on large data sets. We find that many real-world data sets contain regions with widely heterogeneous dimensions. These regions host points differing in core properties: folded versus unfolded configurations in a protein molecular dynamics trajectory, active versus non-active regions in brain imaging data, and firms with different financial risk in company balance sheets. A simple topological feature, the local ID, is thus sufficient to achieve an unsupervised segmentation of high-dimensional data, complementary to the one given by clustering algorithms.
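The sketch below is not the authors' method; it is only a minimal illustration of the underlying idea, namely estimating an intrinsic dimension locally for each point and then segmenting the data by that value. It uses the Levina-Bickel local maximum-likelihood estimator within each point's k-nearest-neighbour ball and a crude median split; the toy data, the function name local_id_mle, and the choice k = 20 are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def local_id_mle(X, k=20):
    """Levina-Bickel maximum-likelihood estimate of the intrinsic
    dimension inside the k-nearest-neighbour ball of every point."""
    dist, _ = NearestNeighbors(n_neighbors=k + 1).fit(X).kneighbors(X)
    r = dist[:, 1:]                              # drop the zero self-distance
    log_ratios = np.log(r[:, -1:] / r[:, :-1])   # log(T_k / T_j), j = 1..k-1
    return (k - 1) / log_ratios.sum(axis=1)

# Toy data: a 2-D plane and a 5-D Gaussian blob, both embedded in 10-D space.
rng = np.random.default_rng(0)
plane = np.pad(rng.normal(size=(500, 2)), ((0, 0), (0, 8)))
blob = np.pad(rng.normal(size=(500, 5)), ((0, 0), (0, 5)))
X = np.vstack([plane, blob])

ids = local_id_mle(X, k=20)
labels = (ids > np.median(ids)).astype(int)      # crude two-region split
print(ids[:500].mean(), ids[500:].mean())        # plane scores near 2, blob near 5
```

Per-point estimates of this kind are noisy, which is why a hard threshold is only a toy device here; the approach described in the abstract is aimed at making exactly this kind of segmentation robust enough to use on large, real-world data sets.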