A Risk Profile for Information Fusion Algorithms

Cited by: 8
Authors
Nelson, Kenric P. [1]
Scannell, Brian J. [1]
Landau, Herbert [2]
Affiliations
[1] Raytheon Integrated Def Syst, Woburn, MA 01801 USA
[2] Raytheon Integrated Def Syst, Arlington, VA 22202 USA
Keywords
Tsallis entropy; proper scoring rules; information fusion; machine learning; central limit theorem; exponential function; generalized entropy; classifiers; statistics; calculus; models; error
DOI
10.3390/e13081518
CLC Number
O4 [Physics]
Discipline Code
0702
Abstract
E. T. Jaynes, originator of the maximum entropy interpretation of statistical mechanics, emphasized that there is an inevitable trade-off between the conflicting requirements of robustness and accuracy for any inferencing algorithm: robustness requires discarding information in order to reduce sensitivity to outliers. The principle of nonlinear statistical coupling, an interpretation of the Tsallis entropy generalization, can be used to quantify this trade-off. The coupled-surprisal, -ln_kappa(p) = -(p^kappa - 1)/kappa, is a generalization of the Shannon surprisal, or logarithmic scoring rule, given a forecast p of a true event by an inferencing algorithm. The coupling parameter kappa = 1 - q, where q is the Tsallis entropy index, is the degree of nonlinear coupling between statistical states. Positive (negative) values of nonlinear coupling decrease (increase) the surprisal information metric and thereby bias the risk in favor of decisive (robust) algorithms relative to the Shannon surprisal (kappa = 0). We show that translating the average coupled-surprisal to an effective probability is equivalent to using the generalized mean of the true-event probabilities as a scoring rule. The metric is used to assess the robustness, accuracy, and decisiveness of a fusion algorithm. We use a two-parameter fusion algorithm to combine input probabilities from N sources: the generalized-mean parameter alpha varies the degree of smoothing, and raising the result to the power N^beta, with beta between 0 and 1, provides a model of correlation between the sources.
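The abstract's metric and fusion rule can be made concrete with a short sketch. The following is a minimal illustration, not the authors' code: the NumPy-based functions coupled_surprisal, effective_probability, and fuse, and the example inputs, are assumptions introduced here to show the coupled surprisal -ln_kappa(p) = -(p^kappa - 1)/kappa, its inversion to a generalized mean of true-event probabilities, and the two-parameter (alpha, beta) fusion of N sources.

import numpy as np

def coupled_surprisal(p, kappa):
    # Coupled surprisal -ln_kappa(p) = -(p**kappa - 1)/kappa.
    # Recovers the Shannon surprisal -ln(p) in the limit kappa -> 0.
    p = np.asarray(p, dtype=float)
    if kappa == 0.0:
        return -np.log(p)
    return -(p**kappa - 1.0) / kappa

def effective_probability(p_true, kappa):
    # Invert the mean coupled surprisal back to a probability.
    # mean[-ln_kappa(p_i)] = (1 - mean(p_i**kappa))/kappa, so the inverse
    # is the generalized (power) mean of the true-event probabilities;
    # kappa = 0 gives the geometric mean (logarithmic scoring rule).
    p_true = np.asarray(p_true, dtype=float)
    if kappa == 0.0:
        return np.exp(np.mean(np.log(p_true)))
    return np.mean(p_true**kappa) ** (1.0 / kappa)

def fuse(p_sources, alpha, beta):
    # Two-parameter fusion of probabilities from N sources: a generalized
    # mean with exponent alpha (degree of smoothing) raised to the power
    # N**beta, 0 <= beta <= 1, as a model of source correlation
    # (beta = 1 acts like independence; beta = 0 like full redundancy).
    p_sources = np.asarray(p_sources, dtype=float)
    n = len(p_sources)
    if alpha == 0.0:
        gmean = np.exp(np.mean(np.log(p_sources)))  # alpha -> 0 limit
    else:
        gmean = np.mean(p_sources**alpha) ** (1.0 / alpha)
    return gmean ** (n**beta)

# Example: three sources forecast the true event with these probabilities.
forecasts = [0.9, 0.7, 0.8]
print(coupled_surprisal(forecasts, kappa=0.5))       # decisive scoring
print(effective_probability(forecasts, kappa=-0.5))  # robust effective probability
print(fuse(forecasts, alpha=0.0, beta=0.5))          # partially correlated fusion

Note that with alpha = 0 and beta = 1 the sketch's fusion rule reduces to the product of the source probabilities, consistent with treating the sources as independent.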
Pages: 1518-1532
Page count: 15