How to evaluate data visualizations across different levels of understanding

Cited by: 35
Authors
Burns, Alyxander [1 ]
Xiong, Cindy [2 ]
Franconeri, Steven [2 ]
Cairo, Alberto [3 ]
Mahyar, Narges [1 ]
Affiliations
[1] UMass Amherst, Amherst, MA 01003 USA
[2] Northwestern Univ, Evanston, IL 60208 USA
[3] Univ Miami, Coral Gables, FL 33124 USA
Source
2020 IEEE WORKSHOP ON EVALUATION AND BEYOND - METHODOLOGICAL APPROACHES TO VISUALIZATION (BELIV 2020) | 2020
Keywords
Human-centered computing; Visualization; Visualization design; evaluation methods; BLOOMS TAXONOMY; TASK TAXONOMY; DESIGN; SPACE;
DOI
10.1109/BELIV51497.2020.00010
CLC number
TP31 [Computer Software];
Subject classification codes
081202; 0835;
Abstract
Understanding a visualization is a multi-level process. A reader must extract and extrapolate from numeric facts, understand how those facts apply to both the context of the data and other potential contexts, and draw or evaluate conclusions from the data. A well-designed visualization should support each of these levels of understanding. We diagnose levels of understanding of visualized data by adapting Bloom's taxonomy, a common framework from the education literature. We describe each level of the framework and provide examples of how it can be applied to evaluate the efficacy of data visualizations along six levels of knowledge acquisition: knowledge, comprehension, application, analysis, synthesis, and evaluation. We present three case studies showing that this framework expands on existing methods to comprehensively measure how a visualization design facilitates a viewer's understanding of visualizations. Although Bloom's original taxonomy suggests a strong hierarchical structure for some domains, we found few examples of dependent relationships between performance at different levels in our three case studies. If this level-independence holds across newly tested visualizations, the taxonomy could serve to inspire more targeted evaluations of the levels of understanding that are relevant to a communication goal.
Pages: 19-28
Page count: 10