Peer review-based selection decisions in individual research funding, applicants' publication strategies and performance: The case of the ERC Starting Grants

Cited by: 28
Authors
Neufeld, Joerg [1]
Huber, Nathalie [1]
Wegner, Antje [1]
Affiliations
[1] Inst Res Informat & Qual Assurance iFQ, D-10117 Berlin, Germany
Funding
European Research Council
DOI
10.1093/reseval/rvt014
Chinese Library Classification
G25 [Library science, library undertakings]; G35 [Information science, information work]
Discipline classification codes
1205; 120501
Abstract
This article investigates how funding decisions depend on applicants' past publication performance in the Starting Grants Programme offered by the European Research Council. Publication data are contrasted with individual publication strategies elicited through an online survey. The empirical results are discussed against the background of evaluation studies on similar funding schemes for young scientists (Boehringer Ingelheim Fonds: Bornmann and Daniel 2007; Individual Grant for the Advancement of Research Leaders: Melin and Danell 2006; Emmy Noether-Programme (ENP): Hornbostel et al. 2009, Neufeld and von Ins 2011). Most of these studies focus on the respective peer review system, bibliometrically investigating its ability to select the 'best' applicants for funding, although they come to different results. An overview of the studies reveals, however, that potential differences in past publication performance between approved and rejected applicants depend not only on the selection decisions (or the peer review) but also on further programme-specific factors, such as the finiteness or openness of the overall budget and the degree of self- or 'pre-selection' amongst potential applicants. As the European Research Council's Starting Grants Programme is a highly prestigious international funding scheme for young scientists with demanding eligibility requirements and low acceptance rates, it constitutes a unique case study for further investigating the relationship between peer review-based selection decisions and applicants' publication performance.
Pages: 237 - 247
Number of pages: 11
References (12 items)
  • [1] Convergent validation of peer review decisions using the h index -: Extent of and reasons for type I and type II errors
    Bornmann, Lutz
    Daniel, Hans-Dieter
    [J]. JOURNAL OF INFORMETRICS, 2007, 1 (03) : 204 - 213
  • [2] Does the Committee Peer Review Select the Best Applicants for Funding? An Investigation of the Selection Process for Two European Molecular Biology Organization Programmes
    Bornmann, Lutz
    Wallon, Gerlind
    Ledin, Anna
    [J]. PLOS ONE, 2008, 3 (10)
  • [3] Bibliometrics as a Performance Measurement Tool for Research Evaluation: The Case of Research Funded by the National Cancer Institute of Canada
    Campbell, David
    Picard-Aitken, Michelle
    Cote, Gregoire
    Caruso, Julie
    Valentim, Rodolfo
    Edmonds, Stuart
    Williams, Gregory Thomas
    Macaluso, Benoit
    Robitaille, Jean-Pierre
    Bastien, Nicolas
    Laframboise, Marie-Claude
    Lebeau, Louis-Michel
    Mirabel, Philippe
    Lariviere, Vincent
    Archambault, Eric
    [J]. AMERICAN JOURNAL OF EVALUATION, 2010, 31 (01) : 66 - 83
  • [4] European Commission, 2012, C20124562 EUR COMM
  • [5] European Commission, 2008, C20083673 EUR COMM
  • [6] The history and meaning of the journal impact factor
    Garfield, E
    [J]. JAMA-JOURNAL OF THE AMERICAN MEDICAL ASSOCIATION, 2006, 295 (01): 90 - 93
  • [7] There are neither "king" nor "crown" in scientometrics: Comments on a supposed "alternative" method of normalization
    Gingras, Yves
    Lariviere, Vincent
    [J]. JOURNAL OF INFORMETRICS, 2011, 5 (01) : 226 - 227
  • [8] Funding of young scientist and scientific excellence
    Hornbostel, Stefan
    Boehmer, Susan
    Klingsporn, Bernd
    Neufeld, Joerg
    von Ins, Markus
    [J]. SCIENTOMETRICS, 2009, 79 (01) : 171 - 190
  • [9] Melin G., 2006, Science and Public Policy, V33, P702, DOI 10.3152/147154306781778579
  • [10] Neufeld J., 2012, RES EVALUAT, V21, P1