Confidence Calibration in a Multiyear Geopolitical Forecasting Competition
Cited: 35
Authors:
Moore, Don A. [1]
Swift, Samuel A. [2]
Minster, Angela [3]
Mellers, Barbara [3]
Ungar, Lyle [3]
Tetlock, Philip [3]
Yang, Heather H. J. [4]
Tenney, Elizabeth R. [5]
Affiliations:
[1] Univ Calif Berkeley, Berkeley, CA 94720 USA
[2] Betterment LLC, New York, NY 10010 USA
[3] Univ Penn, Philadelphia, PA 19104 USA
[4] MIT, Cambridge, MA 02139 USA
[5] Univ Utah, Salt Lake City, UT 84112 USA
Keywords: confidence; overconfidence; forecasting; prediction; PROBABILITY JUDGMENT; OVERCONFIDENCE; ACCURACY; ERROR; INFORMATION; PERFORMANCE; AVERAGE; BIAS; UNDERCONFIDENCE; PREDICTIONS
DOI: 10.1287/mnsc.2016.2525
Chinese Library Classification: C93 [Management Science]
Discipline codes: 12; 1201; 1202; 120202
Abstract:
This research examines the development of confidence and accuracy over time in the context of forecasting. Although overconfidence has been studied in many contexts, little research examines its progression over long periods of time or in consequential policy domains. This study employs a unique data set from a geopolitical forecasting tournament spanning three years, in which thousands of forecasters predicted the outcomes of hundreds of events. We sought to apply insights from prior research to structure the questions, interactions, and elicitations so as to improve forecasts. Indeed, forecasters' confidence roughly matched their accuracy. As information came in, accuracy increased. Confidence increased at approximately the same rate as accuracy, and good calibration persisted. Nevertheless, there was evidence of a small amount of overconfidence (3%), especially on the most confident forecasts. Training helped reduce overconfidence, and team collaboration improved forecast accuracy. Together, teams and training reduced overconfidence to 1%. Our results provide reason for tempered optimism regarding confidence calibration and its development over time in consequential field contexts.
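The overconfidence figures in the abstract can be read as a gap between stated confidence and realized accuracy. A minimal sketch of that measure, using hypothetical forecast data (not the tournament's actual data set or scoring method), could look like this:

```python
# Illustrative sketch with hypothetical data: overconfidence measured as
# mean stated confidence minus the hit rate of binary forecasts.

def overconfidence(confidences, outcomes):
    """Return mean confidence minus hit rate.

    confidences: probabilities assigned to the predicted outcome (0.5-1.0).
    outcomes: 1 if the predicted outcome occurred, else 0.
    A positive result indicates overconfidence; negative, underconfidence.
    """
    mean_conf = sum(confidences) / len(confidences)
    hit_rate = sum(outcomes) / len(outcomes)
    return mean_conf - hit_rate

# Hypothetical forecasts: stated confidence vs. whether each came true.
conf = [0.9, 0.8, 0.7, 0.95, 0.6]
hits = [1, 1, 0, 1, 1]
gap = overconfidence(conf, hits)
print(round(gap, 3))  # near zero here, i.e. roughly well calibrated
```

On this toy data the mean confidence (0.79) nearly matches the hit rate (0.80), analogous to the rough calibration the study reports; a 3% overconfidence corresponds to a gap of about 0.03.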
Pages: 3552-3565 (14 pages)