Learning the Nonlinear Geometry of High-Dimensional Data: Models and Algorithms

Cited by: 17
Authors
Wu, Tong [1]
Bajwa, Waheed U. [1]
Affiliations
[1] Rutgers State Univ, Dept Elect & Comp Engn, Piscataway, NJ 08854 USA
Funding
National Science Foundation (USA)
Keywords
Data-driven learning; kernel methods; missing data; subspace clustering; union of subspaces; SPARSE REPRESENTATION; COMPONENT ANALYSIS; FACE RECOGNITION; SUBSPACE; REDUCTION; RECOVERY; SIGNALS; UNION; FIT;
DOI
10.1109/TSP.2015.2469637
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Codes
0808; 0809;
Abstract
Modern information processing relies on the axiom that high-dimensional data lie near low-dimensional geometric structures. This paper revisits the problem of data-driven learning of these geometric structures and puts forth two new nonlinear geometric models for data describing "related" objects/phenomena. The first of these models straddles the two extremes of the subspace model and the union-of-subspaces model, and is termed the metric-constrained union-of-subspaces (MC-UoS) model. The second, suited for data drawn from a mixture of nonlinear manifolds, generalizes the kernel subspace model and is termed the metric-constrained kernel union-of-subspaces (MC-KUoS) model. The main contributions of this paper are as follows. First, it motivates and formalizes the problems of MC-UoS and MC-KUoS learning. Second, it presents algorithms that efficiently learn an MC-UoS or an MC-KUoS underlying the data of interest. Third, it extends these algorithms to the case where parts of the data are missing. Last, but not least, it reports the outcomes of a series of numerical experiments involving both synthetic and real data that demonstrate the superiority of the proposed geometric models and learning algorithms over existing approaches in the literature. These experiments also help clarify the connections between this work and the literature on (subspace and kernel k-means) clustering.
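To give a concrete sense of the union-of-subspaces setting the abstract describes, below is a minimal sketch of classical K-subspaces clustering, a standard baseline related to the subspace-clustering literature the paper compares against. It is NOT the MC-UoS algorithm from the paper; the function `k_subspaces` and the toy data are illustrative assumptions only. The sketch alternates between fitting a low-dimensional orthonormal basis to each cluster (via SVD) and reassigning each point to the subspace with the smallest projection residual.

```python
import numpy as np

def k_subspaces(X, K, d, n_iter=20, seed=0):
    """Illustrative K-subspaces clustering (not the paper's MC-UoS method).

    X      : (ambient_dim, n_points) data matrix
    K      : number of subspaces
    d      : dimension of each subspace
    Alternates: (1) per-cluster SVD to fit a d-dim orthonormal basis,
                (2) reassign each point to the subspace with the
                    smallest projection residual ||x - U U^T x||.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    labels = rng.integers(0, K, size=n)  # random initial assignment
    bases = []
    for _ in range(n_iter):
        bases = []
        for k in range(K):
            Xk = X[:, labels == k]
            if Xk.shape[1] < d:
                # empty/degenerate cluster: fall back to a random basis
                U = np.linalg.qr(rng.standard_normal((X.shape[0], d)))[0]
            else:
                # top-d left singular vectors span the best-fit subspace
                U = np.linalg.svd(Xk, full_matrices=False)[0][:, :d]
            bases.append(U)
        # residual of each point from each subspace
        resid = np.stack([np.linalg.norm(X - U @ (U.T @ X), axis=0)
                          for U in bases])
        labels = resid.argmin(axis=0)
    return labels, bases

# toy data: noisy points near two 1-D subspaces (lines) in R^3
rng = np.random.default_rng(1)
t = rng.standard_normal(50)
u1 = np.array([[1.0], [0.0], [0.0]])
u2 = np.array([[0.0], [1.0], [0.0]])
X = np.hstack([u1 * t, u2 * t]) + 0.01 * rng.standard_normal((3, 100))
labels, bases = k_subspaces(X, K=2, d=1)
```

The MC-UoS model in the paper additionally constrains the learned subspaces to be close to one another under a metric on the Grassmann manifold, which this plain alternating scheme does not enforce.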
Pages: 6229 - 6244
Number of pages: 16