What makes or breaks competitive research proposals? A mixed-methods analysis of research grant evaluation reports

Cited by: 4
Authors
Hren, Darko [1 ]
Pina, David G. [2 ]
Norman, Christopher R. [3 ]
Marusic, Ana [4 ,5 ]
Affiliations
[1] Univ Split, Fac Humanities & Social Sci, Dept Psychol, Split, Croatia
[2] European Commiss, European Res Execut Agcy, Brussels, Belgium
[3] Sciome LLC, Res Triangle Pk, NC USA
[4] Univ Split, Sch Med, Dept Res Biomed & Hlth, Split, Croatia
[5] Univ Split, Sch Med, Ctr Evidence Based Med, Split, Croatia
Keywords
European Commission; Machine learning; Marie Curie Actions; Peer review outcome; Qualitative analysis; Research grants; LINGUISTIC ANALYSIS; DECISION-MAKING; NEGATIVITY BIAS; PEER; REVIEWERS;
DOI
10.1016/j.joi.2022.101289
Chinese Library Classification (CLC)
TP39 [Computer applications];
Discipline classification codes
081203 ; 0835 ;
Abstract
The evaluation of grant proposals is an essential aspect of competitive research funding, and funding bodies and agencies rely in many instances on external peer reviewers for grant assessment. Most of the available research concerns the quantitative aspects of this assessment; evidence from qualitative studies remains scarce. We used a combination of machine learning and qualitative analysis methods to analyse the reviewers' comments in evaluation reports from 3667 grant applications to the Initial Training Networks (ITN) of the Marie Curie Actions under the Seventh Framework Programme (FP7). Our results show that the reviewers' comments for each evaluation criterion were aligned with the Action's prespecified criteria, and that the evaluation outcome was influenced more by the proposals' weaknesses than by their strengths.
Pages: 17