Two-Sample Testing can be as hard as Structure Learning in Ising Models: Minimax Lower Bounds

Cited by: 0
Authors
Gangrade, Aditya [1]
Nazer, Bobak [1]
Saligrama, Venkatesh [1]
Affiliations
[1] Boston Univ, Boston, MA 02215 USA
Source
2018 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP) | 2018
Keywords
High-dimensional Inference; Two-Sample Testing; Ising Models; Sample Complexity; Change Detection
DOI
Not available
CLC Classification Number
O42 [Acoustics]
Discipline Codes
070206; 082403
Abstract
Consider the following structural two-sample testing problem: given two sets of samples drawn from Ising models, determine whether the underlying network structure has changed. In [1] we showed that, for Ising models over p variables with network structures of degree bounded by d, under mild conditions on the model parameters, the sample complexity of this problem is very close to that of determining either of the network structures. Therefore, the naive scheme of learning and then comparing the structures of both sets of samples is near data-optimal. However, the minimax lower bounds in [1] relied on Ising models that differed in only one edge, which raises the natural follow-up question: are large changes significantly easier to detect? We extend the previously developed framework to this problem and show that, in a certain parameter regime, large changes do not significantly reduce the number of samples necessary for reliable two-sample testing.
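The "learn then compare" baseline mentioned in the abstract can be sketched as a toy experiment. The sketch below is illustrative and not the paper's construction: it draws Gibbs samples from two small chain Ising models and estimates each structure by thresholding empirical pairwise correlations (a crude stand-in for a consistent learner such as the l1-regularized logistic regression of [9]), declaring a change whenever the two estimated edge sets differ. All function names, the coupling value 0.8, and the threshold 0.55 (chosen between the direct-edge chain correlation tanh(0.8) ≈ 0.66 and the two-hop correlation tanh(0.8)^2 ≈ 0.44) are assumptions made for this example.

```python
import math
import random
from itertools import combinations

def gibbs_sample(J, n_samples, n_vars, burn=500, thin=10, seed=0):
    """Draw approximate samples x in {-1,+1}^n_vars from a zero-field Ising
    model with pairwise couplings J = {(i, j): theta_ij} via Gibbs sampling."""
    rng = random.Random(seed)
    x = [rng.choice((-1, 1)) for _ in range(n_vars)]
    samples = []
    for sweep in range(burn + n_samples * thin):
        for i in range(n_vars):
            # Conditional of x_i given the rest: P(x_i = +1) = sigmoid(2 * h_i),
            # where h_i is the local field contributed by i's neighbours.
            h = 0.0
            for (a, b), w in J.items():
                if a == i:
                    h += w * x[b]
                elif b == i:
                    h += w * x[a]
            x[i] = 1 if rng.random() < 1.0 / (1.0 + math.exp(-2.0 * h)) else -1
        if sweep >= burn and (sweep - burn) % thin == 0:
            samples.append(list(x))
    return samples

def estimate_edges(samples, tau):
    """Crude structure estimate: keep pairs whose empirical correlation exceeds
    tau in absolute value. Valid on a zero-field chain, where the correlation
    between spins at graph distance d equals tanh(theta)^d."""
    n, p = len(samples), len(samples[0])
    return {(i, j) for i, j in combinations(range(p), 2)
            if abs(sum(x[i] * x[j] for x in samples) / n) > tau}

def structure_changed(samples_1, samples_2, tau=0.55):
    """Naive two-sample test: learn both structures, report any difference."""
    return estimate_edges(samples_1, tau) != estimate_edges(samples_2, tau)

# Demo: a 4-spin chain versus the same chain with edge (2, 3) deleted.
chain = {(0, 1): 0.8, (1, 2): 0.8, (2, 3): 0.8}
pruned = {(0, 1): 0.8, (1, 2): 0.8}
s_a = gibbs_sample(chain, 4000, 4, seed=1)
s_b = gibbs_sample(chain, 4000, 4, seed=2)
s_c = gibbs_sample(pruned, 4000, 4, seed=3)
print("same model flagged:", structure_changed(s_a, s_b))
print("changed model flagged:", structure_changed(s_a, s_c))
```

Note that correlation thresholding only recovers structure in easy regimes like this weakly coupled chain; the point of the sketch is the test's shape (estimate twice, compare edge sets), whose sample cost the paper lower-bounds.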
Pages: 6931-6935
Number of pages: 5
References (13 in total)
[1] Anandkumar, A., Tan, V. Y. F., Huang, F., Willsky, A. S. High-Dimensional Structure Estimation in Ising Models: Local Separation Criterion. Annals of Statistics, 2012, 40(3): 1346-1375.
[2] [Anonymous], 2016, PROC ADV NEURAL INF.
[3] [Anonymous], 2009, ADV NEURAL INFORM PR.
[4] [Anonymous], 2017, Behaviormetrika.
[5] Bresler, G. Efficiently Learning Ising Models on Arbitrary Graphs. STOC'15: Proceedings of the 2015 ACM Symposium on Theory of Computing, 2015: 771-782.
[6] Fazayeli, F., 2016, PR MACH LEARN RES, V48.
[7] Gangrade, A., 2017, COMM CONTR COMP 2017.
[8] Liu, S., Suzuki, T., Relator, R., Sese, J., Sugiyama, M., Fukumizu, K. Support Consistency of Direct Sparse-Change Learning in Markov Networks. Annals of Statistics, 2017, 45(3): 959-990.
[9] Ravikumar, P., Wainwright, M. J., Lafferty, J. D. High-Dimensional Ising Model Selection Using l1-Regularized Logistic Regression. Annals of Statistics, 2010, 38(3): 1287-1319.
[10] Santhanam, N. P., Wainwright, M. J. Information-Theoretic Limits of Selecting Binary Graphical Models in High Dimensions. IEEE Transactions on Information Theory, 2012, 58(7): 4117-4134.