Paying Attention to Inattentive Survey Respondents

Cited by: 63
Authors
Alvarez, R. Michael [1 ]
Atkeson, Lonna Rae [2 ]
Levin, Ines [3 ]
Li, Yimeng [4 ]
Affiliations
[1] California Institute of Technology, Political Science, Pasadena, CA 91125, USA
[2] University of New Mexico, Political Science, Albuquerque, NM 87131, USA
[3] University of California, Irvine, Political Science, Irvine, CA 92717, USA
[4] California Institute of Technology, Division of the Humanities and Social Sciences, Pasadena, CA 91125, USA
Keywords
survey design; list experiments; survey response; satisficing; instructed-response items; trap questions; sensitive questions; statistical analysis; regression; responses; shirkers; validity
DOI
10.1017/pan.2018.57
Chinese Library Classification (CLC)
D0 [Political science and political theory]
Subject classification codes
0302; 030201
Abstract
Does attentiveness matter in survey responses? Do more attentive survey participants give higher-quality responses? Using data from a recent online survey that identified inattentive respondents using instructed-response items, we demonstrate that ignoring attentiveness provides a biased portrait of the distribution of critical political attitudes and behavior. We show that this bias occurs in the context of both typical closed-ended questions and list experiments. Inattentive respondents are common and are more prevalent among the young and less educated. Those who do not pass the trap questions interact with the survey instrument in distinctive ways: they take less time to respond, are more likely to report nonattitudes, and display lower consistency in their reported choices. Inattentiveness does not occur completely at random, and failing to properly account for it may lead to inaccurate estimates of the prevalence of key political attitudes and behaviors, both sensitive and more prosaic in nature.
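Because the abstract turns on how inattentive respondents can distort list-experiment estimates, the following minimal sketch (Python, written for this summary; the simulated data, the assumed 20% prevalence, and names such as item_count, treated, and passed_irt are illustrative assumptions, not the authors' materials or code) compares the standard difference-in-means list-experiment estimator on a full simulated sample with the same estimator restricted to respondents who pass instructed-response items.

# Illustrative sketch only (not the authors' code or data): the standard
# difference-in-means list experiment estimator computed on a simulated
# survey, first on the full sample and then restricted to respondents who
# pass instructed-response ("trap") items. All names are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Half of respondents see the treatment list (3 baseline items plus the
# sensitive item); the other half see the 3-item control list.
treated = rng.integers(0, 2, size=n).astype(bool)
holds_sensitive = rng.random(n) < 0.20          # assumed true prevalence: 20%
baseline_count = rng.binomial(3, 0.5, size=n)   # affirmed baseline items

# Attentive respondents report the item count as intended.
item_count = baseline_count + (treated & holds_sensitive)

# Assumed behavior for inattentive respondents: they answer the count
# question uniformly at random over the range offered by their list.
passed_irt = rng.random(n) < 0.75               # 25% fail the trap items
random_count = np.where(treated,
                        rng.integers(0, 5, size=n),   # 0..4 on the 4-item list
                        rng.integers(0, 4, size=n))   # 0..3 on the 3-item list
item_count = np.where(passed_irt, item_count, random_count)

def prevalence_estimate(counts, treat):
    # Difference-in-means estimator of the sensitive item's prevalence.
    return counts[treat].mean() - counts[~treat].mean()

print("full sample:    ", round(prevalence_estimate(item_count, treated), 3))
print("attentive only: ", round(prevalence_estimate(item_count[passed_irt],
                                                    treated[passed_irt]), 3))

Under these assumed response patterns, haphazard answers spread over the longer treatment list pull the treatment-group mean upward, so the full-sample estimate overshoots the 20% benchmark while the attentive-only estimate stays close to it; the article's actual data and estimation strategy are those described in the abstract above.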
Pages: 145-162
Page count: 18