Uncorrelation and Evenness: a New Diversity-Promoting Regularizer

Cited by: 0
Authors
Xie, Pengtao [1,2]
Singh, Aarti [1 ]
Xing, Eric P. [2 ]
Affiliations
[1] Carnegie Mellon Univ, Machine Learning Dept, Pittsburgh, PA 15213 USA
[2] Petuum Inc, Pittsburgh, PA 15222 USA
Source
INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 70 | 2017 / Vol. 70
Funding
U.S. National Science Foundation (NSF); U.S. National Institutes of Health (NIH)
DOI: not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Latent space models (LSMs) provide a principled and effective way to extract hidden patterns from observed data. To address two challenges in LSMs, namely (1) capturing infrequent patterns when pattern frequencies are imbalanced and (2) reducing model size without sacrificing expressiveness, several studies have proposed to "diversify" LSMs by designing regularizers that encourage their components to be "diverse". In light of the limitations of existing approaches, we design a new diversity-promoting regularizer based on two factors: uncorrelation and evenness, which encourage the components to be uncorrelated and to play equally important roles in modeling data. Formally, this amounts to encouraging the covariance matrix of the components to have more uniform eigenvalues. We apply the regularizer to two LSMs and develop an efficient optimization algorithm. Experiments on healthcare, image, and text data demonstrate the effectiveness of the regularizer.
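The abstract's core idea, encouraging the covariance (Gram) matrix of the components to have uniform eigenvalues, can be sketched numerically. The snippet below is an illustrative sketch, not necessarily the paper's exact formulation: it measures non-uniformity via the gap between the arithmetic and geometric means of the eigenvalues, a standard quantity that is nonnegative and equals zero exactly when all eigenvalues are equal (i.e., when the components are uncorrelated and "even"). The function name and the choice of the row-wise Gram matrix are assumptions for illustration.

```python
import numpy as np

def uniform_eigenvalue_penalty(W):
    """Penalty that is small when the Gram matrix of the component
    vectors (rows of W) has near-uniform eigenvalues.

    Uses the AM/GM gap of the eigenvalues,
        log(mean(lambda)) - mean(log(lambda)),
    which is >= 0 and equals 0 iff all eigenvalues are equal.
    This is a sketch of the uniform-eigenvalue idea, not the paper's
    exact regularizer.
    """
    M = W @ W.T                          # m x m Gram (unnormalized covariance)
    eigvals = np.linalg.eigvalsh(M)      # symmetric PSD -> real eigenvalues
    eigvals = np.clip(eigvals, 1e-12, None)  # guard against tiny negatives
    return np.log(eigvals.mean()) - np.log(eigvals).mean()

# Orthonormal components: Gram matrix is the identity, penalty is ~0.
W_ortho = np.eye(3, 5)

# Nearly duplicated components: one eigenvalue collapses, penalty grows.
rng = np.random.default_rng(0)
W_corr = rng.normal(size=(3, 5))
W_corr[1] = W_corr[0] + 0.01 * W_corr[1]
```

Adding such a penalty to an LSM's training objective pushes the learned components toward being uncorrelated (off-diagonal Gram entries near zero) and evenly weighted (similar eigenvalues), matching the two factors the abstract names.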
Pages: 10