Graph Embedding With Data Uncertainty

Times cited: 2
Authors
Laakom, Firas [1 ]
Raitoharju, Jenni [2 ]
Passalis, Nikolaos [3 ]
Iosifidis, Alexandros [4 ]
Gabbouj, Moncef [1 ]
Affiliations
[1] Tampere Univ, Fac Informat Technol & Commun Sci, Tampere 33100, Finland
[2] Finnish Environm Inst, Programme Environm Informat, Jyvaskyla 40500, Finland
[3] Aristotle Univ Thessaloniki, Dept Informat, Thessaloniki 54124, Greece
[4] Aarhus Univ, Dept Elect & Comp Engn, DK-8000 Aarhus, Denmark
Funding
Academy of Finland;
Keywords
Uncertainty; Data models; Principal component analysis; Optimization; Gaussian distribution; Eigenvalues and eigenfunctions; Training data; Graph embedding; subspace learning; dimensionality reduction; uncertainty estimation; spectral learning; DIMENSIONALITY REDUCTION; ROBUST-PCA; CLASSIFIERS; FRAMEWORK;
DOI
10.1109/ACCESS.2022.3155233
CLC classification
TP [Automation Technology, Computer Technology];
Subject classification code
0812;
Abstract
Spectral-based subspace learning is a common data preprocessing step in many machine learning pipelines. Its main aim is to learn a meaningful low-dimensional embedding of the data. However, most subspace learning methods do not take into account possible measurement inaccuracies or artifacts that lead to data with high uncertainty; learning directly from such raw data can therefore be misleading and can degrade accuracy. In this paper, we propose to model artifacts in the training data using probability distributions: each data point is represented by a Gaussian distribution centered at the original point, with a variance that models its uncertainty. We reformulate the Graph Embedding framework so that it can learn from distributions, and we study Linear Discriminant Analysis and Marginal Fisher Analysis as special cases. Furthermore, we propose two schemes for modeling data uncertainty based on pairwise distances, one unsupervised and one supervised.
Pages: 24232-24239
Page count: 8
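
The abstract describes the approach only at a high level; the minimal Python sketch below shows one plausible reading of it, not the authors' actual formulation. Each sample is treated as a Gaussian N(x_i, sigma_i^2 I), the per-point variances are estimated from k-nearest-neighbour distances (a stand-in for the unsupervised pairwise-distance scheme), and the within-class scatter of an LDA-style graph embedding is replaced by its expectation under those Gaussians. The names knn_variance and uncertain_lda, the isotropic-variance assumption, and the exact scatter update are assumptions introduced here for illustration.

import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def knn_variance(X, k=5):
    # Assumed unsupervised uncertainty scheme: the mean squared distance to the
    # k nearest neighbours serves as an isotropic variance for each point.
    D = cdist(X, X)
    np.fill_diagonal(D, np.inf)
    knn = np.sort(D, axis=1)[:, :k]
    return (knn ** 2).mean(axis=1)               # shape (n,)

def uncertain_lda(X, y, sigma2, dim=1):
    # LDA viewed as graph embedding, with each sample z_i ~ N(x_i, sigma2_i * I).
    # For independent Gaussians,
    #   E[(z_i - z_j)(z_i - z_j)^T] = (x_i - x_j)(x_i - x_j)^T + (sigma2_i + sigma2_j) * I,
    # so the within-class scatter picks up additive variance terms (assumption).
    d = X.shape[1]
    Sw = np.zeros((d, d))                        # intrinsic (within-class) scatter
    Sb = np.zeros((d, d))                        # penalty (between-class) scatter
    mu = X.mean(axis=0)
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        for i in idx:
            for j in idx:
                if i == j:
                    continue
                diff = (X[i] - X[j])[:, None]
                Sw += diff @ diff.T + (sigma2[i] + sigma2[j]) * np.eye(d)
        diff_c = (X[idx].mean(axis=0) - mu)[:, None]
        Sb += len(idx) * diff_c @ diff_c.T
    # Generalized eigenproblem Sb v = lambda Sw v; keep the top `dim` directions.
    evals, evecs = eigh(Sb, Sw + 1e-6 * np.eye(d))
    return evecs[:, np.argsort(evals)[::-1][:dim]]

# Toy usage: two Gaussian classes in 10 dimensions projected to one dimension.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 10)), rng.normal(2.0, 1.0, (50, 10))])
y = np.repeat([0, 1], 50)
W = uncertain_lda(X, y, knn_variance(X), dim=1)
Z = X @ W                                        # uncertainty-aware embedding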