Cost and Quality in Crowdsourcing Workflows

Cited by: 1
Authors
Helouet, Loic [1 ]
Miklos, Zoltan [2 ]
Singh, Rituraj [2 ]
Affiliations
[1] Univ Rennes 1, INRIA Rennes, Rennes, France
[2] Univ Rennes 1, Rennes, France
Source
APPLICATION AND THEORY OF PETRI NETS AND CONCURRENCY (PETRI NETS 2021) | 2021 / Vol. 12734
Keywords
Crowdsourcing; Data-centric workflows;
DOI
10.1007/978-3-030-76983-3_3
Chinese Library Classification
TP31 [Computer Software];
Discipline Codes
081202 ; 0835 ;
Abstract
Crowdsourcing platforms provide tools to replicate and distribute micro-tasks (simple, independent work units) to crowds and to assemble the results. However, real-life problems are often complex: they require collecting, organizing, or transforming data under quality and cost constraints. This work considers dynamic realization policies for complex crowdsourcing tasks. Workflows provide ways to organize a complex task into phases and to guide its realization. The challenge is then to deploy a workflow on a crowd, i.e., to allocate workers to phases so that the overall workflow terminates with good accuracy of results and at a reasonable cost. Standard "static" allocation in crowdsourcing assigns a fixed number of workers to each micro-task and aggregates their answers. We define new dynamic worker allocation techniques that consider progress in a workflow, the quality of synthesized data, and the remaining budget. Evaluation on a benchmark shows that dynamic approaches outperform static ones in terms of both cost and accuracy.
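The contrast between static and dynamic allocation described in the abstract can be sketched with a small simulation. This is a hypothetical simplification, not the authors' algorithm: the static policy polls a fixed number of workers per micro-task and takes a majority vote, while the dynamic variant keeps polling only until one answer leads by a vote margin (a crude confidence proxy) or the per-task budget runs out. The names `static_allocate`, `dynamic_allocate`, and all parameter values are illustrative assumptions.

```python
import random

random.seed(7)

def worker_answer(truth: bool, accuracy: float = 0.7) -> bool:
    """Simulate one crowd worker: returns the true label with probability `accuracy`."""
    return truth if random.random() < accuracy else not truth

def static_allocate(truth: bool, k: int = 5) -> tuple[bool, int]:
    """Static policy: always ask exactly k workers, then majority-vote."""
    votes = [worker_answer(truth) for _ in range(k)]
    return sum(votes) > k / 2, k

def dynamic_allocate(truth: bool, margin: int = 3, budget: int = 9) -> tuple[bool, int]:
    """Dynamic policy: stop asking once one answer leads by `margin` votes
    (confidence reached) or the remaining budget is exhausted."""
    yes = no = cost = 0
    while abs(yes - no) < margin and cost < budget:
        if worker_answer(truth):
            yes += 1
        else:
            no += 1
        cost += 1
    return yes > no, cost

# Compare average cost and accuracy over many simulated micro-tasks.
n = 2000
s_correct = s_cost = d_correct = d_cost = 0
for _ in range(n):
    ans, c = static_allocate(True)
    s_correct += ans
    s_cost += c
    ans, c = dynamic_allocate(True)
    d_correct += ans
    d_cost += c

print(f"static : accuracy={s_correct / n:.2f}, avg cost={s_cost / n:.2f}")
print(f"dynamic: accuracy={d_correct / n:.2f}, avg cost={d_cost / n:.2f}")
```

The point the sketch illustrates is the one the abstract makes: a confidence-driven stopping rule spends extra workers only on contentious tasks, so on average it can match or beat the static policy's accuracy at a lower cost.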
Pages: 33-54 (22 pages)