Deep Embedded Non-Redundant Clustering

Cited: 0
Authors
Miklautz, Lukas [1 ]
Mautz, Dominik [2 ]
Altinigneli, Muzaffer Can [1 ,2 ]
Boehm, Christian [2 ,3 ]
Plant, Claudia [1 ,4 ]
Affiliations
[1] Univ Vienna, Fac Comp Sci, Vienna, Austria
[2] Ludwig Maximilians Univ Munchen, Munich, Germany
[3] MCML, Munich, Germany
[4] Ds UniVie, Vienna, Austria
Source
THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE | 2020 / Vol. 34
Keywords
DOI
None available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Complex data types like images can be clustered in multiple valid ways. Non-redundant clustering aims at extracting those meaningful groupings by discouraging redundancy between clusterings. Unfortunately, clustering images directly in pixel space has been shown to work unsatisfactorily. This has increased interest in combining the high representational power of deep learning with clustering, termed deep clustering. Algorithms of this type combine the non-linear embedding of an autoencoder with a clustering objective and optimize both simultaneously. None of these algorithms try to find multiple non-redundant clusterings. In this paper, we propose the novel Embedded Non-Redundant Clustering algorithm (ENRC). It is the first algorithm that combines neural-network-based representation learning with non-redundant clustering. ENRC can find multiple highly non-redundant clusterings of different dimensionalities within a data set. This is achieved by (softly) assigning each dimension of the embedded space to the different clusterings. For instance, in image data sets it can group the objects by color, material and shape, without the need for explicit feature engineering. We show the viability of ENRC in extensive experiments and empirically demonstrate the advantage of combining non-linear representation learning with non-redundant clustering.
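The core mechanism described in the abstract, softly assigning each dimension of the embedded space to different clusterings, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the variable names (`beta`, `weights`, `weighted_assign`) and the use of a softmax over per-dimension logits are illustrative assumptions, showing only how each clustering could weight embedding dimensions when assigning points to its own centroids.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n_clusterings = 6, 2            # embedding dimensions, number of clusterings
beta = rng.normal(size=(d, n_clusterings))  # hypothetical learnable logits

# Soft assignment of each embedded dimension to a clustering:
# a softmax across clusterings, so each dimension's weights sum to 1.
weights = np.exp(beta) / np.exp(beta).sum(axis=1, keepdims=True)  # (d, n_clusterings)

def weighted_assign(X, centers, w):
    """Assign each point to its nearest center under per-dimension weights w."""
    diff = X[:, None, :] - centers[None, :, :]   # (n, k, d)
    dist = (w * diff ** 2).sum(axis=2)           # weighted squared distances, (n, k)
    return dist.argmin(axis=1)                   # cluster label per point, (n,)

X = rng.normal(size=(10, d))          # stand-in for autoencoder embeddings
centers = rng.normal(size=(3, d))     # stand-in centroids of one clustering
labels = [weighted_assign(X, centers, weights[:, c]) for c in range(n_clusterings)]
```

Because the weights are soft rather than a hard partition of dimensions, the two clusterings can share dimensions to different degrees, which is what lets ENRC-style methods trade off between fully disjoint subspaces and overlapping ones.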
Pages: 5174-5181
Page count: 8
Related Papers
33 in total
[1] Aljalbout, Elie. 2018. CoRR abs/1801.07648.
[2] [Anonymous]. CoRR abs/1512.03385.
[3] [Anonymous]. 2017. 31st International Conference on Neural Information Processing Systems.
[4] [Anonymous]. 2011. NIPS.
[5] Arthur, D. 2007. Proceedings of the Eighteenth Annual ACM-SIAM Symposium on Discrete Algorithms, p. 1027.
[6] Bengio, Yoshua; Courville, Aaron; Vincent, Pascal. 2013. Representation Learning: A Review and New Perspectives. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(8): 1798-1828.
[7] Chang, Y. H. 2017. IEEE Engineering in Medicine and Biology Conference, p. 672. DOI 10.1109/EMBC.2017.8036914.
[8] Chen, D. 2017. Proceedings of the Workshops of the 31st AAAI Conference on Artificial Intelligence, p. 385.
[9] Cui, Y. 2007. IEEE International Conference on Data Mining, p. 133. DOI 10.1109/ICDM.2007.94.
[10] Dilokthanakul, N. 2016. DEEP UNSUPERVISED CL.