How to evaluate data visualizations across different levels of understanding

Cited by: 35
Authors
Burns, Alyxander [1 ]
Xiong, Cindy [2 ]
Franconeri, Steven [2 ]
Cairo, Alberto [3 ]
Mahyar, Narges [1 ]
Affiliations
[1] UMass Amherst, Amherst, MA 01003 USA
[2] Northwestern Univ, Evanston, IL 60208 USA
[3] Univ Miami, Coral Gables, FL 33124 USA
Source
2020 IEEE WORKSHOP ON EVALUATION AND BEYOND - METHODOLOGICAL APPROACHES TO VISUALIZATION (BELIV 2020) | 2020
Keywords
Human-centered computing; visualization; visualization design; evaluation methods; Bloom's taxonomy; task taxonomy; design; space
DOI
10.1109/BELIV51497.2020.00010
Chinese Library Classification (CLC)
TP31 [Computer Software];
Subject Classification
081202; 0835;
Abstract
Understanding a visualization is a multi-level process. A reader must extract and extrapolate from numeric facts, understand how those facts apply to both the context of the data and other potential contexts, and draw or evaluate conclusions from the data. A well-designed visualization should support each of these levels of understanding. We diagnose levels of understanding of visualized data by adapting Bloom's taxonomy, a common framework from the education literature. We describe each level of the framework and provide examples for how it can be applied to evaluate the efficacy of data visualizations along six levels of knowledge acquisition: knowledge, comprehension, application, analysis, synthesis, and evaluation. We present three case studies showing that this framework expands on existing methods to comprehensively measure how a visualization design facilitates a viewer's understanding of visualizations. Although Bloom's original taxonomy suggests a strong hierarchical structure for some domains, we found few examples of dependent relationships between performance at different levels in our three case studies. If this level-independence holds across newly tested visualizations, the taxonomy could serve to inspire more targeted evaluations of levels of understanding that are relevant to a communication goal.
Pages: 19-28
Page count: 10