Intuitive and Axiomatic Arguments for Quantifying Diagnostic Test Performance in Units of Information

Cited by: 12
Author
Benish, W. A. [1,2]
Affiliations
[1] Louis Stokes Cleveland Dept Vet Affairs Med Ctr, Cleveland, OH USA
[2] Case Western Reserve Univ, Cleveland, OH 44106 USA
Keywords
Diagnostic tests; information theory; mutual information; registration; index; curve
DOI
10.3414/ME0627
Chinese Library Classification
TP [Automation technology; computer technology]
Subject Classification Code
0812
Abstract
Objectives: Mutual information is a fundamental concept of information theory that quantifies the expected amount of information that diagnostic testing provides about a patient's disease state. The purpose of this report is to provide both intuitive and axiomatic descriptions of mutual information and, thereby, to promote the use of this statistic as a measure of diagnostic test performance.
Methods: We derive the mathematical expression for mutual information from the intuitive assumption that diagnostic information is the average amount by which diagnostic testing reduces our surprise upon ultimately learning a patient's diagnosis. This concept is formalized by defining "surprise" as the surprisal, a function that quantifies the unlikelihood of an event. Mutual information is also shown to be the only function that conforms to a set of axioms that are reasonable requirements of a measure of diagnostic information. These axioms are related to the axioms of information theory used to derive the expression for entropy.
Results: Both approaches to defining mutual information lead to the known relationship that mutual information equals the pretest uncertainty of the disease state minus the expected value of the posttest uncertainty of the disease state. Mutual information is also additive when a test provides information about independent health problems.
Conclusion: Mutual information is the best single measure of the ability of a diagnostic test to discriminate among the possible disease states.
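
To make the Results statement concrete, here is a minimal numerical sketch, assuming a hypothetical binary test with illustrative prevalence, sensitivity, and specificity values (none of these numbers come from the paper). It computes the surprisal-based entropies and checks the identity stated above, that mutual information equals pretest uncertainty minus expected posttest uncertainty, I(D;T) = H(D) - H(D|T):

```python
import math

def surprisal(p):
    """Surprisal of an event with probability p, in bits: -log2(p)."""
    return -math.log2(p)

def entropy(dist):
    """Shannon entropy, i.e. the expected surprisal, of a distribution (in bits)."""
    return sum(p * surprisal(p) for p in dist if p > 0)

# Hypothetical binary diagnostic test (illustrative values only).
prevalence = 0.10   # P(D+)
sensitivity = 0.90  # P(T+ | D+)
specificity = 0.80  # P(T- | D-)

# Joint distribution over (disease state, test result).
p_joint = {
    ("D+", "T+"): prevalence * sensitivity,
    ("D+", "T-"): prevalence * (1 - sensitivity),
    ("D-", "T+"): (1 - prevalence) * (1 - specificity),
    ("D-", "T-"): (1 - prevalence) * specificity,
}

# Marginal distributions of the test result and the disease state.
p_test = {t: sum(p for (d, t2), p in p_joint.items() if t2 == t) for t in ("T+", "T-")}
p_disease = {d: sum(p for (d2, t), p in p_joint.items() if d2 == d) for d in ("D+", "D-")}

# Pretest uncertainty H(D).
h_pre = entropy(p_disease.values())

# Expected posttest uncertainty H(D|T) = sum_t P(t) * H(D | T = t).
h_post = sum(
    p_test[t] * entropy([p_joint[(d, t)] / p_test[t] for d in ("D+", "D-")])
    for t in ("T+", "T-")
)

# Mutual information two ways: via the identity and via the direct double sum.
mi_identity = h_pre - h_post
mi_direct = sum(
    p * math.log2(p / (p_disease[d] * p_test[t]))
    for (d, t), p in p_joint.items() if p > 0
)

print(f"H(D) = {h_pre:.4f} bits, H(D|T) = {h_post:.4f} bits")
print(f"I(D;T) = {mi_identity:.4f} bits (identity) = {mi_direct:.4f} bits (direct)")
```

With these illustrative numbers the two computations agree at I(D;T) ≈ 0.145 bits against a pretest uncertainty H(D) ≈ 0.469 bits; that is, the test resolves roughly a third of the uncertainty about the disease state.
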
Pages: 552-557 (6 pages)