Understanding Malicious Behavior in Crowdsourcing Platforms: The Case of Online Surveys

Cited: 129
Authors
Gadiraju, Ujwal [1 ]
Kawase, Ricardo [1 ]
Dietze, Stefan [1 ]
Demartini, Gianluca [2 ]
Affiliations
[1] Leibniz Univ Hannover, Res Ctr L3S, Hannover, Germany
[2] Univ Sheffield, Informat Sch, Sheffield, S Yorkshire, England
Source
CHI 2015: PROCEEDINGS OF THE 33RD ANNUAL CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS | 2015
Keywords
Crowdsourcing; Microtasks; Online Surveys; User Behavior; Malicious Intent;
DOI
10.1145/2702123.2702443
CLC Classification Number
TP3 [Computing Technology; Computer Technology];
Subject Classification Code
0812 ;
Abstract
Crowdsourcing is increasingly being used as a means to tackle problems requiring human intelligence. With the ever-growing worker base that aims to complete microtasks on crowdsourcing platforms in exchange for financial gains, there is a need for stringent mechanisms to prevent exploitation of deployed tasks. Quality control mechanisms need to accommodate a diverse pool of workers, exhibiting a wide range of behavior. A pivotal step towards fraud-proof task design is understanding the behavioral patterns of microtask workers. In this paper, we analyze the prevalent malicious activity on crowdsourcing platforms and study the behavior exhibited by trustworthy and untrustworthy workers, particularly on crowdsourced surveys. Based on our analysis of the typical malicious activity, we define and identify different types of workers in the crowd, propose a method to measure malicious activity, and finally present guidelines for the efficient design of crowdsourced surveys.
Pages: 1631 - 1640
Page count: 10