Two-sample Testing Using Deep Learning

Cited by: 0
Authors
Kirchler, Matthias [1 ,2 ]
Khorasani, Shahryar [1 ]
Kloft, Marius [2 ,3 ]
Lippert, Christoph [1 ,4 ]
Affiliations
[1] Univ Potsdam, Hasso Plattner Inst Digital Engn, Potsdam, Germany
[2] Tech Univ Kaiserslautern, Kaiserslautern, Germany
[3] Univ Southern Calif, Los Angeles, CA 90007 USA
[4] Hasso Plattner Inst Digital Hlth Mt Sinai, New York, NY USA
Source
INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108 | 2020 / Vol. 108
Funding
National Institutes of Health (US); Canadian Institutes of Health Research;
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We propose a two-sample testing procedure based on learned deep neural network representations. To this end, we define two test statistics that perform an asymptotic location test on data samples mapped onto a hidden layer. The tests are consistent and asymptotically control the type-1 error rate. Their test statistics can be evaluated in linear time (in the sample size). Suitable data representations are obtained in a data-driven way, by solving a supervised or unsupervised transfer-learning task on an auxiliary (potentially distinct) data set. If no auxiliary data is available, we split the data into two chunks: one for learning representations and one for computing the test statistic. In experiments on audio samples, natural images, and three-dimensional neuroimaging data, our tests yield significant decreases in type-2 error rate (up to 35 percentage points) compared to state-of-the-art two-sample tests such as kernel methods and classifier two-sample tests.
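To make the general recipe concrete, below is a minimal sketch in Python of a location test on learned features; it is an illustration of the idea described in the abstract, not the authors' exact test statistics. The feature extractor `phi` is a hypothetical placeholder for any fixed, pretrained mapping (e.g., a hidden layer of a network trained on auxiliary data or on a held-out split of the samples), and the Hotelling-style statistic with a chi-square approximation is one standard choice of asymptotic location test.

```python
# Sketch: two-sample mean-difference (location) test on hidden-layer features.
# `phi` is an assumed, pretrained feature extractor; details of the test
# statistic are illustrative, not the paper's exact construction.
import numpy as np
from scipy import stats

def two_sample_feature_test(X, Y, phi):
    """Hotelling-style test on features phi(X) vs. phi(Y).

    X, Y : arrays of raw samples from the two distributions.
    phi  : callable mapping a batch of samples to d-dimensional features.
    Returns the test statistic and an asymptotic chi-square p-value.
    """
    ZX, ZY = phi(X), phi(Y)              # map both samples to feature space
    n, m = len(ZX), len(ZY)
    d = ZX.shape[1]
    diff = ZX.mean(axis=0) - ZY.mean(axis=0)
    # Pooled covariance of the features (regularized for numerical stability).
    S = ((n - 1) * np.cov(ZX, rowvar=False) +
         (m - 1) * np.cov(ZY, rowvar=False)) / (n + m - 2)
    S += 1e-6 * np.eye(d)
    # Hotelling T^2 statistic; asymptotically chi-square with d dof under H0.
    t2 = (n * m) / (n + m) * diff @ np.linalg.solve(S, diff)
    pval = stats.chi2.sf(t2, df=d)
    return t2, pval
```

Both the mean difference and the pooled covariance are computed in a single pass over the mapped samples, which is what makes the statistic evaluable in time linear in the sample size.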
Pages: 1387-1397
Page count: 11