Crowd IQ: Measuring the Intelligence of Crowdsourcing Platforms

Cited by: 0
Authors
Kosinski, Michal [1 ]
Bachrach, Yoram [1 ]
Kasneci, Gjergji [1 ]
Van-Gael, Jurgen [1 ]
Graepel, Thore [1 ]
Affiliations
[1] Univ Cambridge, Psychometr Ctr, Cambridge CB2 1TN, England
Source
PROCEEDINGS OF THE 3RD ANNUAL ACM WEB SCIENCE CONFERENCE, 2012
Keywords
Crowdsourcing; Psychometrics; Incentive Schemes; Systems; World
DOI
None available
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
We measure crowdsourcing performance based on a standard IQ questionnaire, and examine Amazon's Mechanical Turk (AMT) performance under different conditions. These include variations of the payment amount offered, the way incorrect responses affect workers' reputations, threshold reputation scores of participating AMT workers, and the number of workers per task. We show that crowds composed of workers of high reputation achieve higher performance than low-reputation crowds, and that the effect of the payment amount is non-monotone: both paying too much and paying too little reduce performance. Furthermore, higher performance is achieved when the task is designed such that incorrect responses can decrease workers' reputation scores. Using majority vote to aggregate multiple responses to the same task can significantly improve performance, which can be further boosted by dynamically allocating workers to tasks in order to break ties.
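The aggregation scheme the abstract describes, majority vote over multiple worker responses, with additional workers dynamically allocated to a task when the vote is tied, can be sketched as follows. This is an illustrative sketch only, not the authors' implementation; the function names (`majority_vote`, `aggregate`, `request_extra_worker`) and the `max_extra` cap are assumptions introduced here.

```python
from collections import Counter

def majority_vote(responses):
    """Return the most common response, or None if the top two counts tie."""
    counts = Counter(responses).most_common()
    if len(counts) > 1 and counts[0][1] == counts[1][1]:
        return None  # tie: no clear majority
    return counts[0][0]

def aggregate(responses, request_extra_worker, max_extra=5):
    """Aggregate worker answers by majority vote; on a tie, dynamically
    allocate one more worker (via request_extra_worker) until it breaks,
    up to max_extra additional workers."""
    responses = list(responses)
    for _ in range(max_extra):
        winner = majority_vote(responses)
        if winner is not None:
            return winner
        responses.append(request_extra_worker())
    return majority_vote(responses)
```

For example, with two workers split between "A" and "B", a single extra worker answering "A" breaks the tie and the task resolves to "A".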
Pages: 151-160
Number of pages: 10
Related Papers
50 records in total
  • [31] Effect of Crowd Voting on Participation in Crowdsourcing Contests
    Chen, Liang
    Xu, Pei
    Liu, De
    JOURNAL OF MANAGEMENT INFORMATION SYSTEMS, 2020, 37 (02) : 510 - 535
  • [32] Comparing two crowdsourcing platforms: assessing their potential for mapping Antarctica
    Gulnerman, Ayse Giz
    Senel, Muge
    Gokduman, Ozan Deniz
    EARTH SCIENCE INFORMATICS, 2024, 17 (05) : 4655 - 4673
  • [33] Collaborative Software Development Platforms for Crowdsourcing
    Peng, Xin
    Babar, Muhammad Ali
    Ebert, Christof
    IEEE SOFTWARE, 2014, 31 (02) : 30 - 36
  • [34] Affect and Creative Performance on Crowdsourcing Platforms
    Morris, Robert R.
    Dontcheva, Mira
    Finkelstein, Adam
    Gerber, Elizabeth
    2013 HUMAINE ASSOCIATION CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION (ACII), 2013, : 67 - 72
  • [35] A Fair Incentive Mechanism for Crowdsourcing in Crowd Sensing
    Zhu, Xuan
    An, Jian
    Yang, Maishun
    Xiang, Lele
    Yang, Qiangwei
    Gui, Xiaolin
    IEEE INTERNET OF THINGS JOURNAL, 2016, 3 (06): : 1364 - 1372
  • [36] DESIGNING COMPLEX CROWDSOURCING APPLICATIONS COVERING MULTIPLE PLATFORMS AND TASKS
    Bozzon, Alessandro
    Brambilla, Marco
    Ceri, Stefano
    Mauri, Andrea
    Volonterio, Riccardo
    JOURNAL OF WEB ENGINEERING, 2015, 14 (5-6): : 443 - 473
  • [37] Optimal Fee Structures of Crowdsourcing Platforms
    Wen, Zhong
    Lin, Lihui
    DECISION SCIENCES, 2016, 47 (05) : 820 - 850
  • [38] Managing the Crowd: Towards a Taxonomy of Crowdsourcing Processes
    Geiger, David
    Seedorf, Stefan
    Schulze, Thimo
    Nickerson, Robert
    Schader, Martin
    AMCIS 2011 PROCEEDINGS, 2011,
  • [39] Rating Mechanisms for Sustainability of Crowdsourcing Platforms
    Qiu, Chenxi
Squicciarini, Anna
    Rajtmajer, Sarah
    PROCEEDINGS OF THE 28TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT (CIKM '19), 2019, : 2003 - 2012
  • [40] The Adoption of Crowdsourcing Platforms in South Africa
    Chuene, Daniel
    Mtsweni, Jabu
    2015 IST-AFRICA CONFERENCE, 2015,