Structure Preserving Encoding of Non-euclidean Similarity Data

Cited by: 3
Authors
Muench, Maximilian [1 ,2 ]
Raab, Christoph [1 ,3 ]
Biehl, Michael [2 ]
Schleif, Frank-Michael [1 ]
Affiliations
[1] Univ Appl Sci Wurzburg Schweinfurt, Dept Comp Sci & Business Informat Syst, D-97074 Wurzburg, Germany
[2] Univ Groningen, Bernoulli Inst Math Comp Sci & Artificial Intelli, POB 407, NL-9700 AK Groningen, Netherlands
[3] Bielefeld Univ, Ctr Excellence, CITEC, Cognit Interact Technol, D-33619 Bielefeld, Germany
Source
ICPRAM: PROCEEDINGS OF THE 9TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION APPLICATIONS AND METHODS | 2020
Keywords
Non-euclidean; Similarity; Indefinite; Von Mises Iteration; Eigenvalue Correction; Shifting; Flipping; Clipping; CLASSIFICATION; RECOGNITION;
DOI
10.5220/0008955100430051
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Domain-specific proximity measures, like divergence measures in signal processing or alignment scores in bioinformatics, often lead to non-metric, indefinite similarities or dissimilarities. However, many classical learning algorithms like kernel machines assume metric properties and struggle with such metric violations. For example, the classical support vector machine is no longer able to converge to an optimum. One possible direction to solve the indefiniteness problem is to transform the non-metric (dis-)similarity data into positive (semi-)definite matrices. For this purpose, many approaches have been proposed that adapt the eigenspectrum of the given data such that positive definiteness is ensured. Unfortunately, most of these approaches modify the eigenspectrum so strongly that valuable information is removed or noise is added to the data. In particular, the shift operation has attracted a lot of interest in the past few years despite its recurring disadvantages. In this work, we propose a modified advanced shift correction method that preserves the eigenspectrum structure of the data by means of a low-rank approximated nullspace correction. We compare our advanced shift to classical eigenvalue corrections like eigenvalue clipping, flipping, squaring, and shifting on several benchmark data sets. The impact of a low-rank approximation on the data's eigenspectrum is analyzed.
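The classical eigenvalue corrections the abstract compares against (clipping, flipping, squaring, shifting) can be sketched as follows. This is an illustrative sketch only, not the paper's advanced shift method: each variant eigendecomposes a symmetric indefinite similarity matrix, repairs the negative eigenvalues, and reassembles a positive semi-definite matrix.

```python
import numpy as np

def correct_spectrum(S, mode="clip"):
    """Repair the eigenspectrum of a symmetric indefinite similarity
    matrix S so the result is positive semi-definite (PSD)."""
    S = (S + S.T) / 2.0              # enforce exact symmetry
    w, V = np.linalg.eigh(S)         # real eigendecomposition S = V diag(w) V^T
    if mode == "clip":               # set negative eigenvalues to zero
        w = np.maximum(w, 0.0)
    elif mode == "flip":             # replace eigenvalues by their absolute value
        w = np.abs(w)
    elif mode == "square":           # square the eigenvalues
        w = w ** 2
    elif mode == "shift":            # shift the whole spectrum by |lambda_min|
        w = w - min(w.min(), 0.0)
    else:
        raise ValueError(f"unknown mode: {mode}")
    return (V * w) @ V.T             # reassemble the corrected matrix

# Toy indefinite similarity matrix: eigenvalues are 3 and -1.
S = np.array([[1.0, 2.0],
              [2.0, 1.0]])
for mode in ("clip", "flip", "square", "shift"):
    C = correct_spectrum(S, mode)
    assert np.linalg.eigvalsh(C).min() >= -1e-9  # corrected matrix is PSD
```

Note how the variants differ in what they discard: clipping removes the negative part of the spectrum entirely, flipping and squaring keep its magnitude, and shifting adds |lambda_min| to every eigenvalue, which is the global distortion the paper's nullspace-based advanced shift aims to avoid.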
Pages: 43-51
Page count: 9