Exploring the Quality, Efficiency, and Representative Nature of Responses Across Multiple Survey Panels

Cited: 10
Authors
Bentley, Frank [1 ]
O'Neill, Kathleen [1 ]
Quehl, Katie [1 ]
Lottridge, Danielle [2 ]
Affiliations
[1] Yahoo Verizon Media, Sunnyvale, CA 94089 USA
[2] Univ Auckland, Auckland, New Zealand
Source
PROCEEDINGS OF THE 2020 CHI CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS (CHI'20) | 2020
Keywords
Survey; MTurk; SurveyMonkey; Representative; MECHANICAL TURK;
DOI
10.1145/3313831.3376671
Chinese Library Classification (CLC)
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812
Abstract
A common practice in HCI research is to conduct a survey to understand the generalizability of findings from smaller-scale qualitative research. These surveys are typically deployed to convenience samples, on low-cost platforms such as Amazon's Mechanical Turk or Survey Monkey, or to more expensive market research panels offered by a variety of premium firms. Costs can vary widely, from hundreds of dollars to tens of thousands of dollars depending on the platform used. We set out to understand the accuracy of ten different survey platforms/panels compared to ground truth data for a total of 6,007 respondents on 80 different aspects of demographic and behavioral questions. We found several panels that performed significantly better than others on certain topics, while different panels provided longer and more relevant open-ended responses. Based on this data, we highlight the benefits and pitfalls of using a variety of survey distribution options in terms of the quality, efficiency, and representative nature of the respondents and the types of responses that can be obtained.
Pages: 12
References: 47