Crowdsourcing research questions in science

Times cited: 24
Authors
Beck, Susanne [1,2]
Brasseur, Tiare-Maria [1,2]
Poetz, Marion [1,2]
Sauermann, Henry [3]
Affiliations
[1] Ludwig Boltzmann Gesellschaft, Open Innovation in Science Center (LBG OIS Center), Nussdorfer Str. 64, A-1090 Vienna, Austria
[2] Copenhagen Business School, Department of Strategy & Innovation, Kilevej 14A, DK-2000 Frederiksberg, Denmark
[3] European School of Management & Technology, Schlossplatz 1, D-10178 Berlin, Germany
Keywords
Crowd science; Citizen science; Crowdsourcing; Problem solving; Problem finding; Agenda setting; Organization of science; CENTERED OUTCOMES RESEARCH; CITIZEN SCIENCE; OPEN INNOVATION; KNOWLEDGE; CREATIVITY; PATIENT; COMMUNITIES; SOFTWARE; PRODUCT; PROJECT
DOI
10.1016/j.respol.2022.104491
Chinese Library Classification
C93 [Management]
Discipline classification codes
12; 1201; 1202; 120202
Abstract
Scientists are increasingly crossing the boundaries of the professional system by involving the general public (the crowd) directly in their research. However, this crowd involvement tends to be confined to empirical work and it is not clear whether and how crowds can also be involved in conceptual stages such as formulating the questions that research is trying to address. Drawing on five different "paradigms" of crowdsourcing and related mechanisms, we first discuss potential merits of involving crowds in the formulation of research questions (RQs). We then analyze data from two crowdsourcing projects in the medical sciences to describe key features of RQs generated by crowd members and compare the quality of crowd contributions to that of RQs generated in the conventional scientific process. We find that the majority of crowd contributions are problem restatements that can be useful to assess problem importance but provide little guidance regarding potential causes or solutions. At the same time, crowd-generated research questions frequently cross disciplinary boundaries by combining elements from different fields within and especially outside medicine. Using evaluations by professional scientists, we find that the average crowd contribution has lower novelty and potential scientific impact than professional research questions, but comparable practical impact. Crowd contributions outperform professional RQs once we apply selection mechanisms at the level of individual contributors or across contributors. Our findings advance research on crowd and citizen science, crowdsourcing and distributed knowledge production, as well as the organization of science. We also inform ongoing policy debates around the involvement of citizens in research in general, and agenda setting in particular.
Pages: 21