Pivotal-Aware Principal Component Analysis

Cited by: 2
Authors
Li, Xuelong [1 ,2 ]
Li, Pei [1 ,2 ]
Zhang, Hongyuan [1 ,2 ]
Zhu, Kangjia [2 ,3 ]
Zhang, Rui [2 ,3 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Comp Sci, Xian 710072, Shaanxi, Peoples R China
[2] Northwestern Polytech Univ, Sch Artificial Intelligence, Opt & Elect iOPEN, Xian 710072, Shaanxi, Peoples R China
[3] Shanghai Artificial Intelligence Lab, Shanghai 200232, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Principal component analysis; Training; Robustness; Pollution measurement; Dimensionality reduction; Data models; Biological system modeling; Collaborative-enhanced learning; dimensionality reduction; principal component analysis (PCA); robust loss; REGRESSION;
DOI
10.1109/TNNLS.2023.3252602
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Conventional principal component analysis (PCA) frequently suffers from the disturbance of outliers, and a spectrum of extensions and variations of PCA has therefore been developed. However, all the existing extensions of PCA derive from the same motivation: alleviating the negative effect of occlusion. In this article, we design a novel collaborative-enhanced learning framework that instead aims to highlight the pivotal data points. Under the proposed framework, only a subset of well-fitting samples is adaptively highlighted, giving these samples greater significance during training, while the disturbance of polluted samples is collaboratively reduced. In other words, two contrary mechanisms work cooperatively under the proposed framework. Based on this framework, we further develop a pivotal-aware PCA (PAPCA), which utilizes the framework to simultaneously augment positive samples and constrain negative ones while retaining the rotational invariance property. Extensive experiments demonstrate that our model achieves superior performance compared with existing methods that focus only on the negative samples.
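The abstract describes the core mechanism only at a high level: well-fitting ("pivotal") samples are adaptively up-weighted, polluted samples are down-weighted, and the principal subspace is re-estimated under these weights. The sketch below illustrates that general idea as an iteratively reweighted PCA in Python. It is a minimal illustration under assumed design choices (a softmax weighting over negative reconstruction errors with a temperature `tau`), not the PAPCA objective or algorithm proposed in the paper.

```python
import numpy as np

def reweighted_pca(X, k, tau=1.0, n_iter=20):
    """Illustrative sample-reweighted PCA (not the paper's PAPCA).

    X: (n_samples, n_features) data matrix, assumed centered.
    k: dimension of the principal subspace.
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)                      # start from uniform weights
    for _ in range(n_iter):
        # Weighted covariance and its top-k eigenvectors (current subspace).
        C = (X * w[:, None]).T @ X
        _, eigvecs = np.linalg.eigh(C)           # eigenvalues in ascending order
        W = eigvecs[:, -k:]                      # (d, k) projection basis
        # Per-sample reconstruction error under the current subspace.
        err = np.linalg.norm(X - X @ W @ W.T, axis=1)
        # Up-weight well-fitting ("pivotal") samples, down-weight badly
        # fitting (potentially polluted) ones; softmax rule is an assumption.
        w = np.exp(-err / tau)
        w /= w.sum()
    return W, w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    X -= X.mean(axis=0)                          # center before calling
    W, w = reweighted_pca(X, k=3)
```

After convergence, the returned weights `w` indicate which samples the procedure treats as pivotal (large weight) versus polluted (small weight); the two contrary mechanisms mentioned in the abstract correspond here to the single weighting rule acting in opposite directions on the two groups.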
Pages: 12201-12210
Number of pages: 10