Developing guidelines for assessing visual analytics environments

In this article, we develop guidelines for evaluating visual analytics environments based on three sources: a synthesis of reviews of entries to the 2009 Visual Analytics Science and Technology (VAST) Symposium Challenge, a small user study with professional intelligence analysts, and evaluation guidelines developed by researchers in related domains. Analyzing the 2009 VAST Challenge reviews gave us a better understanding of what matters to our reviewers, both visualization researchers and professional analysts. The user study identified the factors professional analysts consider important when evaluating visual analysis systems. We synthesized the results of these three efforts into an initial set of guidelines for use by others in the community. One challenge for future visual analytics systems is to support the generation of reports, so in the user study we also worked with analysts to understand the criteria they use to judge the quality of analytic reports; we propose that this knowledge will be useful as researchers build systems that automate parts of report generation. The outcome is an initial set of guidelines for evaluating visual analytics environments and a second set for evaluating analytic reports. It is important to understand that these guidelines are initial drafts and are limited in scope, as the visual analytics systems we evaluated were used for specific tasks. We offer them as a starting point for the visual analytics community.
