Understanding peer review of software engineering papers

Cited by: 7
Authors
Ernst, Neil A. [1 ]
Carver, Jeffrey C. [2 ]
Mendez, Daniel [3 ,4 ]
Torchiano, Marco [5 ]
Affiliations
[1] Univ Victoria, Victoria, BC, Canada
[2] Univ Alabama, Tuscaloosa, AL USA
[3] Blekinge Inst Technol, Karlskrona, Sweden
[4] Fortiss GmbH, Munich, Germany
[5] Politecn Torino, Turin, Italy
Keywords
Peer review; Interview; Survey;
DOI
10.1007/s10664-021-10005-5
CLC classification
TP31 [Computer software];
Discipline codes
081202; 0835;
Abstract
Context Peer review is a key activity intended to preserve the quality and integrity of scientific publications. However, in practice it is far from perfect. Objective We aim to understand how reviewers, including those who have won awards for reviewing, perform their reviews of software engineering papers, in order to identify both what makes a good reviewing approach and what makes a good paper. Method We first conducted a series of interviews with recognised reviewers in the software engineering field. We then used the results of those interviews to develop a questionnaire for an online survey sent to reviewers from well-respected venues covering a number of software engineering disciplines, some of whom had won awards for their reviewing efforts. Results We analyzed the responses from the interviews and from the 175 reviewers who completed the online survey (both reviewers who had won awards and those who had not). We report several descriptive results, including: nearly half of award winners (45%) review 20+ conference papers a year, compared with 28% of non-award winners; and the majority of reviewers (88%) spend more than two hours on a journal review. We also report qualitative results. Our findings suggest that the most important criterion of a good review is that it be factual and helpful, which ranked above others such as being detailed or kind. The features of papers that most often lead to positive reviews are a clear and well-supported validation, an interesting problem, and novelty. Conversely, negative reviews tend to result from papers with a mismatch between the method and the claims, and from papers with overly grandiose claims. Further insights include, but are not limited to, that reviewers view data availability and its consistency as important, and that authors need to make the contribution of their work very clear in the paper.
Conclusions Based on the insights gained through our study, we conclude our work by compiling a proto-guideline for reviewing. We hope that our work contributes to the ongoing debate and to contemporary efforts to further improve peer review models.
Pages: 29