Formalizing Agreement Analysis for Elicitation Studies: New Measures, Significance Test, and Toolkit

Cited by: 153
Authors
Vatavu, Radu-Daniel [1 ]
Wobbrock, Jacob O. [2 ]
Affiliations
[1] Univ Stefan Cel Mare Suceava, Suceava 720229, Romania
[2] Univ Washington, Informat Sch, DUB Grp, Seattle, WA 98195 USA
Source
CHI 2015: Proceedings of the 33rd Annual CHI Conference on Human Factors in Computing Systems, 2015
Keywords
Guessability study; agreement rate; methodology; statistical test; user-defined gestures; disagreement; coagreement
DOI
10.1145/2702123.2702223
Chinese Library Classification
TP3 [computing technology; computer technology]
Discipline Code
0812
Abstract
In this work, we address agreement rate analysis, the process of characterizing the level of consensus among participants' proposals elicited during guessability studies. Two new measures, the disagreement rate for referents and the coagreement rate between referents, are proposed to accompany the widely used agreement rate formula of Wobbrock et al. [37] when reporting participants' consensus for symbolic input. A statistical significance test for comparing the agreement rates of k >= 2 referents is presented in analogy with Cochran's success/failure Q test [5], for which we express the test statistic in terms of agreement and coagreement rates. We deliver a toolkit to assist practitioners in computing agreement, disagreement, and coagreement rates, and in running statistical tests for agreement rates at the p = .05, .01, and .001 levels of significance. We validate our theoretical development of agreement rate analysis against several previously published elicitation studies. For example, when we present the probability distribution function of the agreement rate measure, we also use it (1) to explain the magnitude of agreement rates previously reported in the literature, and (2) to propose qualitative interpretations for agreement rates, in analogy with Cohen's guidelines for effect sizes [6]. We also re-examine previously published elicitation data from the perspective of the agreement rate test statistic, and highlight new findings on the effect of referents on agreement rates, unattainable prior to this work. We hope that our contributions will advance current knowledge of agreement rate analysis, providing researchers and practitioners with new techniques and tools to help them understand user-elicited data at deeper levels of detail and sophistication.
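The agreement rate AR(r) discussed in the abstract is Wobbrock et al.'s measure as reformulated by this paper: the proportion of participant pairs whose proposals for a referent r are identical. A minimal Python sketch of that formula and its complementary disagreement rate (function names are ours for illustration, not those of the toolkit the paper delivers) might look like:

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate AR(r) for a single referent: the share of
    participant pairs whose elicited proposals are identical."""
    n = len(proposals)
    if n < 2:
        raise ValueError("AR(r) needs at least two proposals")
    counts = Counter(proposals)  # sizes of groups of identical proposals
    squared_shares = sum((c / n) ** 2 for c in counts.values())
    # Rescale so AR(r) spans [0, 1]: 0 when every proposal differs,
    # 1 when all participants proposed the same sign.
    return n / (n - 1) * squared_shares - 1 / (n - 1)

def disagreement_rate(proposals):
    """Disagreement rate DR(r) = 1 - AR(r)."""
    return 1.0 - agreement_rate(proposals)
```

For example, with proposals ["pinch", "pinch", "swipe", "tap"], exactly one of the six participant pairs agrees, so AR(r) = 1/6 ≈ .167.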
Pages: 1325-1334 (10 pages)
References (34 in total)
[21] Mauney, D. 2010. CHI '10 Extended Abstracts on Human Factors in Computing Systems (CHI EA '10), New York, NY, USA, p. 4015. DOI: 10.1145/1753846.1754095
[22] McNemar, Q. 1947. Note on the sampling error of the difference between correlated proportions or percentages. Psychometrika, 12(2), 153-157.
[23] Morris, M. R. 2012. Proceedings of the 2012 ACM International Conference on Interactive Tabletops and Surfaces (ITS '12), p. 95. DOI: 10.1145/2396636.2396651
[24] Morris, M. R. 2014. Interactions, 21, p. 40. DOI: 10.1145/2591689
[25] Morris, M. R. 2010. Proceedings of Graphics Interface 2010 (GI '10), p. 261.
[26] Obaid, M. 2012. Social Robotics: 4th International Conference (ICSR 2012), Proceedings, p. 367. DOI: 10.1007/978-3-642-34103-8_37
[27] Piumsomboon, T. 2013. Lecture Notes in Computer Science, vol. 8118, p. 282.
[28] Pyryeskin, D. 2012. Proceedings of the 2012 ACM International Conference on Interactive Tabletops and Surfaces, p. 1. DOI: 10.1145/2396636.2396638
[29] Ruiz, J. 2011. 29th Annual CHI Conference on Human Factors in Computing Systems, p. 197.
[30] Sedgewick, R. 1977. Computing Surveys, 9, p. 137. DOI: 10.1145/356689.356692