INFORMATION THEORETIC STRUCTURE LEARNING WITH CONFIDENCE

Cited by: 0
Authors
Moon, Kevin R. [1 ]
Noshad, Morteza [2 ]
Sekeh, Salimeh Yasaei [2 ]
Hero, Alfred O., III [2 ]
Affiliations
[1] Yale Univ, Dept Genet, New Haven, CT 06520 USA
[2] Univ Michigan, Elect Engn & Comp Sci, Ann Arbor, MI 48109 USA
Source
2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP) | 2017
Keywords
mutual information; structure learning; ensemble estimation; hypothesis testing
DOI
None
Chinese Library Classification
O42 [Acoustics]
Discipline Codes
070206; 082403
Abstract
Information theoretic measures (e.g., the Kullback-Leibler divergence and Shannon mutual information) have been used to explore possibly nonlinear multivariate dependencies in high dimensions. If these dependencies are assumed to follow a Markov factor graph model, this exploration process is called structure discovery. For discrete-valued samples, estimates of the information divergence over the parametric class of multinomial models lead to structure discovery methods whose mean squared error achieves parametric convergence rates as the sample size grows. However, a naive application of this method to continuous nonparametric multivariate models converges much more slowly. In this paper we introduce a new method for nonparametric structure discovery that uses weighted ensemble divergence estimators that achieve parametric convergence rates and obey an asymptotic central limit theorem, which facilitates hypothesis testing and other types of statistical validation.
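The weighted-ensemble idea in the abstract can be illustrated with a minimal sketch: compute several base k-nearest-neighbor divergence estimates (here a simplified 1-D form of the Wang-Kulkarni-Verdu estimator) and combine them with weights. This is not the paper's exact estimator; the paper chooses the weights by solving an optimization that cancels bias terms, whereas the uniform weights below are a hypothetical stand-in, and all function names are illustrative.

```python
import numpy as np

def knn_kl_divergence(x, y, k):
    """Simplified 1-D k-NN estimate of the KL divergence D(p || q).

    x: samples from p; y: samples from q. Sketch of the
    Wang-Kulkarni-Verdu nearest-neighbor estimator, not the
    paper's estimator.
    """
    n, m = len(x), len(y)
    # Distance from each x_i to its k-th nearest neighbor within x
    # (column 0 after sorting is the self-distance 0, so take column k).
    rho = np.sort(np.abs(x[:, None] - x[None, :]), axis=1)[:, k]
    # Distance from each x_i to its k-th nearest neighbor in y.
    nu = np.sort(np.abs(x[:, None] - y[None, :]), axis=1)[:, k - 1]
    return np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

def ensemble_kl(x, y, ks=(3, 5, 7, 9), weights=None):
    """Weighted ensemble of base estimators at several values of k.

    Uniform weights are used here as a placeholder for the
    bias-cancelling optimized weights of the paper.
    """
    estimates = np.array([knn_kl_divergence(x, y, k) for k in ks])
    w = np.full(len(ks), 1.0 / len(ks)) if weights is None else np.asarray(weights)
    return float(w @ estimates)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 2000)  # samples from p = N(0, 1)
y = rng.normal(1.0, 1.0, 2000)  # samples from q = N(1, 1); true KL = 0.5
d_hat = ensemble_kl(x, y)
```

Under the paper's scheme, averaging base estimators with suitably optimized weights drives the bias down fast enough to recover the parametric mean-squared-error rate; the uniform average above only illustrates the ensemble structure, not that rate.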
Pages: 6095-6099
Page count: 5