NCLDR: Nearest-Neighbor Contrastive Learning with Dual Correlation Loss for Dimensionality Reduction

Times Cited: 0
Authors
Feng, Linlin [1 ]
Wang, Changpeng [1 ]
Liu, Pei [1 ]
Ge, Kangjian [1 ]
Zhang, Jiangshe [2 ]
Affiliations
[1] Changan Univ, Sch Sci, Xian 710064, Peoples R China
[2] Xi An Jiao Tong Univ, Sch Math & Stat, Xian 710049, Peoples R China
Keywords
Dimensionality reduction; Contrastive learning; Neighbor embedding
DOI
10.1016/j.neucom.2024.127848
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Dimensionality reduction is an efficient method for alleviating the curse of dimensionality in high-dimensional data. As a popular self-supervised learning paradigm, contrastive learning has recently garnered considerable attention. In this paper, we propose NCLDR: Nearest-Neighbor Contrastive Learning with Dual Correlation Loss for Dimensionality Reduction, a novel dimensionality reduction method that ports a contrastive learning framework to the specific task of dimensionality reduction. First, NCLDR uses nearest neighbors to construct feature pairs from the training set itself. Then, to produce representations that are invariant across such pairs while decorrelating feature variables, a simple multi-layer perceptron (MLP) network architecture with a dual correlation loss function is designed. Compared with most dimensionality reduction methods, NCLDR bypasses the complexity of optimizing kNN graphs and facilitates the embedding of out-of-sample data. It also alleviates the issue of "dimensional collapse" in the low-dimensional representation space. Finally, experimental results demonstrate that the proposed method achieves significant improvements over state-of-the-art dimensionality reduction methods.
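The two ingredients the abstract names, nearest-neighbor positive pairs and a decorrelating loss on the embeddings, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names are hypothetical, and the loss shown is a Barlow-Twins-style cross-correlation loss (diagonal pulled to 1 for invariance, off-diagonal pushed to 0 to counteract dimensional collapse), which the paper's "dual correlation loss" may refine or differ from.

```python
import numpy as np

def nearest_neighbor_pairs(X):
    """Pair each sample with its Euclidean nearest neighbor (excluding
    itself), yielding positive pairs (x_i, x_nn(i)) drawn from the
    training set rather than from data augmentation."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)          # a point is not its own neighbor
    nn = d2.argmin(axis=1)
    return X, X[nn]

def correlation_loss(Z1, Z2, lam=0.005):
    """Illustrative decorrelation loss on two batches of embeddings:
    standardize per dimension, form the cross-correlation matrix, then
    penalize diagonal deviation from 1 (pair invariance) and nonzero
    off-diagonal entries (redundant, collapsed dimensions)."""
    Z1 = (Z1 - Z1.mean(0)) / (Z1.std(0) + 1e-8)
    Z2 = (Z2 - Z2.mean(0)) / (Z2.std(0) + 1e-8)
    C = (Z1.T @ Z2) / len(Z1)             # d x d cross-correlation
    on_diag = ((np.diag(C) - 1.0) ** 2).sum()
    off_diag = (C ** 2).sum() - (np.diag(C) ** 2).sum()
    return on_diag + lam * off_diag
```

In a full pipeline the two halves of each pair would be mapped through the shared MLP encoder before the loss is computed; at convergence the embedding dimensions are near-unit-correlated across a pair but mutually decorrelated, which is what discourages dimensional collapse.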
Pages: 8