Predicting the replicability of social science lab experiments

Cited by: 52
Authors
Altmejd, Adam [1 ,2 ]
Dreber, Anna [1 ,3 ]
Forsell, Eskil [1 ]
Huber, Juergen [3 ]
Imai, Taisuke [4 ]
Johannesson, Magnus [1 ]
Kirchler, Michael [3 ]
Nave, Gideon [5 ]
Camerer, Colin [6 ]
Affiliations
[1] Stockholm Sch Econ, Dept Econ, Stockholm, Sweden
[2] Stockholm Univ, SOFI, Stockholm, Sweden
[3] Univ Innsbruck, Innsbruck, Austria
[4] Ludwig Maximilians Univ Munchen, Munich, Germany
[5] Univ Penn, Wharton Sch, Philadelphia, PA 19104 USA
[6] CALTECH, Pasadena, CA 91125 USA
Source
PLOS ONE | 2019 / Vol. 14 / No. 12
Funding
Austrian Science Fund;
Keywords
REPLICATION; REPRODUCIBILITY; PUBLICATION; PSYCHOLOGY;
DOI
10.1371/journal.pone.0225826
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Subject classification code
07; 0710; 09;
Abstract
We measure how accurately the replication of experimental results can be predicted by black-box statistical models. With data from four large-scale replication projects in experimental psychology and economics, and techniques from machine learning, we train predictive models and study which variables drive predictable replication. The models predict binary replication with a cross-validated accuracy rate of 70% (AUC of 0.77) and estimate relative effect sizes with a Spearman rho of 0.38. This accuracy is similar to that of market-aggregated beliefs of peer scientists [1, 2]. The predictive power is validated in a pre-registered out-of-sample test of the outcome of [3], where 71% of replication outcomes are predicted correctly (AUC of 0.73) and effect size correlations amount to rho = 0.25. Basic features, such as the sample and effect sizes in original papers and whether reported effects are single-variable main effects or two-variable interactions, are predictive of successful replication. The models presented in this paper are simple tools for producing cheap, prognostic replicability metrics. They could be useful in institutionalizing the evaluation of new findings and in guiding resources to those direct replications that are likely to be most informative.
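For readers who want a concrete sense of the evaluation pipeline the abstract describes, the sketch below trains a cross-validated classifier on study-level features and reports accuracy, AUC, and a Spearman rank correlation. This is not the authors' code: the choice of a random forest, the feature set, and the synthetic data are illustrative assumptions only.

import numpy as np
from scipy.stats import spearmanr
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedKFold, cross_val_predict

rng = np.random.default_rng(0)

# Hypothetical study-level features (placeholders, not the paper's dataset):
# original sample size, original p-value, original effect size, interaction indicator.
n_studies = 120
X = np.column_stack([
    rng.integers(20, 400, n_studies),            # original sample size
    rng.uniform(0.001, 0.05, n_studies),         # original p-value
    rng.uniform(0.1, 0.8, n_studies),            # original effect size
    rng.integers(0, 2, n_studies),               # 1 = two-variable interaction effect
])
y = rng.integers(0, 2, n_studies)                # 1 = study replicated (synthetic labels)
relative_effect = rng.uniform(0.0, 1.2, n_studies)  # replication / original effect size

# Out-of-fold predicted replication probabilities from a cross-validated classifier.
clf = RandomForestClassifier(n_estimators=500, random_state=0)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
proba = cross_val_predict(clf, X, y, cv=cv, method="predict_proba")[:, 1]

accuracy = ((proba > 0.5) == y).mean()           # cross-validated classification accuracy
auc = roc_auc_score(y, proba)                    # area under the ROC curve
rho, _ = spearmanr(proba, relative_effect)       # rank correlation with relative effect sizes
print(f"CV accuracy: {accuracy:.2f}, AUC: {auc:.2f}, Spearman rho: {rho:.2f}")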
Pages: 18
Related papers (50 in total)
  • [1] Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015
    Camerer, Colin F.
    Dreber, Anna
    Holzmeister, Felix
    Ho, Teck-Hua
    Huber, Juergen
    Johannesson, Magnus
    Kirchler, Michael
    Nave, Gideon
    Nosek, Brian A.
    Pfeiffer, Thomas
    Altmejd, Adam
    Buttrick, Nick
    Chan, Taizan
    Chen, Yiling
    Forsell, Eskil
    Gampa, Anup
    Heikensten, Emma
    Hummer, Lily
    Imai, Taisuke
    Isaksson, Siri
    Manfredi, Dylan
    Rose, Julia
    Wagenmakers, Eric-Jan
    Wu, Hang
    NATURE HUMAN BEHAVIOUR, 2018, 2 (09): : 637 - 644
  • [2] Predicting the replicability of social and behavioural science claims in COVID-19 preprints
    Marcoci, Alexandru
    Wilkinson, David P.
    Vercammen, Ans
    Wintle, Bonnie C.
    Abatayo, Anna Lou
    Baskin, Ernest
    Berkman, Henk
    Buchanan, Erin M.
    Capitan, Sara
    Capitan, Tabare
    Chan, Ginny
    Cheng, Kent Jason G.
    Coupe, Tom
    Dryhurst, Sarah
    Duan, Jianhua
    Edlund, John E.
    Errington, Timothy M.
    Fedor, Anna
    Fidler, Fiona
    Field, James G.
    Fox, Nicholas
    Fraser, Hannah
    Freeman, Alexandra L. J.
    Hanea, Anca
    Holzmeister, Felix
    Hong, Sanghyun
    Huggins, Raquel
    Huntington-Klein, Nick
    Johannesson, Magnus
    Jones, Angela M.
    Kapoor, Hansika
    Kerr, John
    Struhl, Melissa Kline
    Kolczynska, Marta
    Liu, Yang
    Loomas, Zachary
    Luis, Brianna
    Mendez, Esteban
    Miske, Olivia
    Mody, Fallon
    Nast, Carolin
    Nosek, Brian A.
    Simon Parsons, E.
    Pfeiffer, Thomas
    Reed, W. Robert
    Roozenbeek, Jon
    Schlyfestone, Alexa R.
    Schneider, Claudia R.
    Soh, Andrew
    Song, Zhongchen
    NATURE HUMAN BEHAVIOUR, 2025, 9 (02): : 287 - 304
  • [3] Examining the replicability of online experiments selected by a decision market
    Holzmeister, Felix
    Johannesson, Magnus
    Camerer, Colin F.
    Chen, Yiling
    Ho, Teck-Hua
    Hoogeveen, Suzanne
    Huber, Juergen
    Imai, Noriko
    Imai, Taisuke
    Jin, Lawrence
    Kirchler, Michael
    Ly, Alexander
    Mandl, Benjamin
    Manfredi, Dylan
    Nave, Gideon
    Nosek, Brian A.
    Pfeiffer, Thomas
    Sarafoglou, Alexandra
    Schwaiger, Rene
    Wagenmakers, Eric-Jan
    Walden, Viking
    Dreber, Anna
    NATURE HUMAN BEHAVIOUR, 2025, 9 (02): : 316 - 330
  • [4] Editorial: Replicability in Cognitive Science
    Strickland, Brent
    De Cruz, Helen
    REVIEW OF PHILOSOPHY AND PSYCHOLOGY, 2021, 12 (01) : 1 - 7
  • [5] Philosophy of science and the replicability crisis
    Romero, Felipe
    PHILOSOPHY COMPASS, 2019, 14 (11)
  • [6] Evaluating replicability of laboratory experiments in economics
    Camerer, Colin F.
    Dreber, Anna
    Forsell, Eskil
    Ho, Teck-Hua
    Huber, Juergen
    Johannesson, Magnus
    Kirchler, Michael
    Almenberg, Johan
    Altmejd, Adam
    Chan, Taizan
    Heikensten, Emma
    Holzmeister, Felix
    Imai, Taisuke
    Isaksson, Siri
    Nave, Gideon
    Pfeiffer, Thomas
    Razen, Michael
    Wu, Hang
    SCIENCE, 2016, 351 (6280) : 1433 - 1436
  • [7] Promoting Reproducibility and Replicability in Political Science
    Brodeur, Abel
    Esterling, Kevin
    Ankel-Peters, Joerg
    Bueno, Natalia S.
    Desposato, Scott
    Dreber, Anna
    Genovese, Federica
    Green, Donald P.
    Hepplewhite, Matthew
    de la Guardia, Fernando Hoces
    Johannesson, Magnus
    Kotsadam, Andreas
    Miguel, Edward
    Velez, Yamil R.
    Young, Lauren
    RESEARCH & POLITICS, 2024, 11 (01)
  • [8] #EEGManyLabs: Investigating the replicability of influential EEG experiments
    Pavlov, Yuri G.
    Adamian, Nika
    Appelhoff, Stefan
    Arvaneh, Mahnaz
    Benwell, Christopher S. Y.
    Beste, Christian
    Bland, Amy R.
    Bradford, Daniel E.
    Bublatzky, Florian
    Busch, Niko A.
    Clayson, Peter E.
    Cruse, Damian
    Czeszumski, Artur
    Dreber, Anna
    Dumas, Guillaume
    Ehinger, Benedikt
    Ganis, Giorgio
    He, Xun
    Hinojosa, Jose A.
    Huber-Huber, Christoph
    Inzlicht, Michael
    Jack, Bradley N.
    Johannesson, Magnus
    Jones, Rhiannon
    Kalenkovich, Evgenii
    Kaltwasser, Laura
    Karimi-Rouzbahani, Hamid
    Keil, Andreas
    Konig, Peter
    Kouara, Layla
    Kulke, Louisa
    Ladouceur, Cecile D.
    Langer, Nicolas
    Liesefeld, Heinrich R.
    Luque, David
    MacNamara, Annmarie
    Mudrik, Liad
    Muthuraman, Muthuraman
    Neal, Lauren B.
    Nilsonne, Gustav
    Niso, Guiomar
    Ocklenburg, Sebastian
    Oostenveld, Robert
    Pernet, Cyril R.
    Pourtois, Gilles
    Ruzzoli, Manuela
    Sass, Sarah M.
    Schaefer, Alexandre
    Senderecka, Magdalena
    Snyder, Joel S.
    CORTEX, 2021, 144 : 213 - 229
  • [9] Replicability, Robustness, and Reproducibility in Psychological Science
    Nosek, Brian A.
    Hardwicke, Tom E.
    Moshontz, Hannah
    Allard, Aurelien
    Corker, Katherine S.
    Dreber, Anna
    Fidler, Fiona
    Hilgard, Joe
    Struhl, Melissa Kline
    Nuijten, Michele B.
    Rohrer, Julia M.
    Romero, Felipe
    Scheel, Anne M.
    Scherer, Laura D.
    Schoenbrodt, Felix D.
    Vazire, Simine
    ANNUAL REVIEW OF PSYCHOLOGY, 2022, 73 : 719 - 748
  • [10] Improving the Replicability of Psychological Science Through Pedagogy
    Hawkins, Robert X. D.
    Smith, Eric N.
    Au, Carolyn
    Arias, Juan Miguel
    Catapano, Rhia
    Hermann, Eric
    Keil, Martin
    Lampinen, Andrew
    Raposo, Sarah
    Reynolds, Jesse
    Salehi, Shima
    Salloum, Justin
    Tan, Jed
    Frank, Michael C.
    ADVANCES IN METHODS AND PRACTICES IN PSYCHOLOGICAL SCIENCE, 2018, 1 (01) : 7 - 18