Unified feature extraction framework based on contrastive learning

Times Cited: 10
Authors
Zhang, Hongjie [1 ,2 ]
Qiang, Wenwen [5 ,6 ]
Zhang, Jinxin [1 ,3 ]
Chen, Yingyi [1 ,2 ,3 ,4 ]
Jing, Ling [7 ]
Affiliations
[1] China Agr Univ, Coll Informat & Elect Engn, Beijing 100083, Peoples R China
[2] China Agr Univ, Natl Innovat Ctr Digital Fishery, Beijing 100083, Peoples R China
[3] Minist Agr & Rural Affairs, Key Lab Smart Farming Technol Aquat Anim & Livestock, Beijing 100083, Peoples R China
[4] Beijing Engn & Technol Res Ctr Internet Things Agr, Beijing 100083, Peoples R China
[5] Univ Chinese Acad Sci, Beijing 100049, Peoples R China
[6] Chinese Acad Sci, Inst Software, Sci & Technol Integrated Informat Syst Lab, Beijing 100190, Peoples R China
[7] China Agr Univ, Coll Sci, Beijing 100083, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Feature extraction; Dimension reduction; Self-supervised learning; Contrastive learning; DIMENSIONALITY REDUCTION; RECOGNITION; PROJECTIONS;
DOI
10.1016/j.knosys.2022.110028
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Code
081104; 0812; 0835; 1405;
Abstract
Feature extraction is an efficient approach for alleviating the curse of dimensionality in high-dimensional data. Contrastive learning (CL), a popular self-supervised learning method, has recently attracted considerable attention. In this study, starting from a new perspective on CL, we propose a unified framework suitable for both unsupervised and supervised feature extraction. In this framework, two CL graphs are first constructed to uniquely define the positive and negative pairs. The projection matrix is then obtained by minimizing a contrastive loss function. Because the framework is formulated entirely in terms of positive and negative pairs, it unifies unsupervised and supervised feature extraction. Three specific methods are derived under this framework: unsupervised CL, supervised CL without local preservation, and supervised CL with local preservation. Finally, numerical experiments on six real-world datasets demonstrate the superior performance of the proposed framework over existing methods. (c) 2022 Elsevier B.V. All rights reserved.
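The abstract only outlines the pipeline (build positive/negative pair graphs, then solve for a projection matrix under a contrastive loss), so the following is a minimal, hypothetical sketch in that spirit rather than the authors' actual formulation: a k-nearest-neighbour graph supplies the positive pairs, all remaining pairs act as negatives, and the projection matrix is obtained from a generalized eigenvalue problem on the two graph Laplacians. The function name contrastive_graph_projection, the kNN construction, the regularization term, and the trace-ratio-style objective are all illustrative assumptions.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import kneighbors_graph


def contrastive_graph_projection(X, n_components=2, n_neighbors=10):
    """Illustrative linear feature extraction from positive/negative pair graphs.

    Assumed setup (not the paper's exact method): k-nearest neighbours form the
    positive-pair graph, every other pair forms the negative-pair graph, and the
    projection keeps positive pairs close while pushing negative pairs apart via
    a generalized eigenvalue problem on the two graph Laplacians.
    """
    n, d = X.shape
    # Positive-pair graph: symmetrized k-nearest-neighbour adjacency.
    W_pos = kneighbors_graph(X, n_neighbors, mode='connectivity').toarray()
    W_pos = np.maximum(W_pos, W_pos.T)
    # Negative-pair graph: every pair that is not a positive pair (no self-loops).
    W_neg = (1.0 - W_pos) - np.eye(n)
    # Graph Laplacians L = D - W for both graphs.
    L_pos = np.diag(W_pos.sum(axis=1)) - W_pos
    L_neg = np.diag(W_neg.sum(axis=1)) - W_neg
    # Minimize tr(V^T X^T L_pos X V) relative to tr(V^T X^T L_neg X V):
    # the columns of V are the generalized eigenvectors with smallest eigenvalues.
    A = X.T @ L_pos @ X
    B = X.T @ L_neg @ X + 1e-6 * np.eye(d)  # small ridge term for numerical stability
    _, eigvecs = eigh(A, B)
    V = eigvecs[:, :n_components]  # projection matrix, shape (d, n_components)
    return X @ V, V
```

A supervised variant of the same sketch would simply replace the kNN adjacency with a same-class indicator matrix when building W_pos, which is one way to read the abstract's claim that positive/negative pair definitions unify the unsupervised and supervised settings.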
Pages: 13