Information Visualization Evaluation

Evaluation is one of the hottest topics in Information Visualization. Many visualization tools have been developed in recent years, but comparatively little effort has gone into evaluating their effectiveness and utility.


Evaluation criteria

Some criteria that can be addressed in an evaluation process are the following:

  • Functionality – to what extent does the system provide the functionality the users require?
  • Effectiveness – does the visualization provide value? Does it offer new insight? How? Why?
  • Efficiency – to what extent does the visualization help users achieve better performance?
  • Usability – how easily can users interact with the system? Is the information presented in a clear and understandable format?
  • Usefulness – is the visualization useful? Who may benefit from it?


It is possible to classify evaluation techniques into two main categories, analytic and empirical:

  • Analytic evaluation, based on formal analysis models and conducted by experts; analytic evaluations can be further divided into the following:
    • Heuristic/expert evaluation, where experts act as less experienced users and describe the potential problems they foresee for such users;
    • Cognitive walkthroughs, where experts walk through a specific task using a prototype.
  • Empirical evaluation, carried out through experiments with user tests; these can be further distinguished into quantitative studies and qualitative studies.
    • Quantitative studies consist of the analysis of specific hypotheses tested through direct measurements.
      • Controlled experiments (or experimental studies), where an evaluator manipulates a number of factors associated with the interface design and studies their effects on various aspects of user performance.
    • Qualitative studies involve the analysis of qualitative data, which may be obtained through questionnaires, interviews, and observation of users using the system, in order to understand and explain social phenomena.
      • Focus groups: group interviews of individuals selected and assembled by researchers to discuss and comment on, from personal experience, the topic that is the subject of the research.
      • Interviews: conducted with users, asking specific questions to elicit information about users' impressions and general comments.
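As a concrete illustration of the quantitative, controlled-experiment style of evaluation, the sketch below compares task-completion times measured under two interface conditions with Welch's t-statistic. The data and condition names are entirely hypothetical, and a real study would also compute degrees of freedom and a p-value; this only shows the basic shape of such an analysis.

```python
import statistics

def welch_t(sample_a, sample_b):
    """Welch's t-statistic for two independent samples with
    possibly unequal variances (here: seconds per task under
    two interface conditions)."""
    mean_a, mean_b = statistics.mean(sample_a), statistics.mean(sample_b)
    var_a, var_b = statistics.variance(sample_a), statistics.variance(sample_b)
    std_err = (var_a / len(sample_a) + var_b / len(sample_b)) ** 0.5
    return (mean_a - mean_b) / std_err

# Hypothetical completion times (seconds), one value per participant.
baseline_ui = [48.2, 51.0, 44.7, 55.3, 49.9, 52.4]
new_vis_ui  = [39.5, 42.1, 37.8, 45.0, 40.6, 43.2]

t = welch_t(baseline_ui, new_vis_ui)
print(f"t = {t:.2f}")  # large positive t: the new UI looks faster
```

A positive t here means the baseline condition was slower on average; whether the difference is statistically significant would then be judged against the t-distribution.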

Some metrics

Classic HCI methods are not always appropriate for assessing InfoVis systems. Some commonly used HCI metrics are:

  • time required to learn the system
  • time required to achieve a goal
  • error rates
  • retention of interface use over time
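Two of the metrics above, time to achieve a goal and error rate, can be derived directly from logged study sessions. The following is a minimal sketch; the `Trial` record and field names are illustrative, not from any real logging framework.

```python
from dataclasses import dataclass

@dataclass
class Trial:
    """One logged attempt at a benchmark task
    (hypothetical schema for illustration only)."""
    participant: str
    seconds: float
    errors: int
    completed: bool

def summarize(trials):
    """Aggregate a session log into two of the HCI metrics above:
    mean time to achieve the goal (completed trials only)
    and average error rate per trial."""
    done = [t for t in trials if t.completed]
    mean_time = sum(t.seconds for t in done) / len(done)
    error_rate = sum(t.errors for t in trials) / len(trials)
    return mean_time, error_rate

log = [
    Trial("p1", 34.0, 1, True),
    Trial("p2", 41.5, 0, True),
    Trial("p3", 60.0, 3, False),  # gave up; excluded from mean time
]
mean_time, error_rate = summarize(log)
print(mean_time, error_rate)
```

Time to learn the system and retention over time would need repeated sessions with the same participants, so they are omitted here.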

Conferences and publications

  • International Journal of Human-Computer Studies, special issue on Empirical evaluation of information visualizations, Vol. 53, Issue 5.
  • Proceedings of the 2006 AVI workshop on BEyond time and errors: novel evaluation methods for information visualization, Venice, Italy, May 23, 2006.
  • BELIV’08: Beyond time and errors: novel evaluation methods for information visualization. A Workshop of the ACM CHI 2008 Conference, April 5, 2008 – Florence, Italy.


In this section I collect a list of papers focusing on this issue:

  • Chen C. Top 10 Unsolved Information Visualization Problems. IEEE Computer Graphics and Applications. 25 (4). pp.12-16. 2005.
  • S.T. Barlow, P. Neville. A Comparison of 2-D Visualizations of Hierarchies. Proc. of IEEE Symposium on Information Visualization, 2001.
  • P. Saraiya, C. North, K. Duca. An Evaluation of Microarray Visualization Tools for Biological Insight. Proc. of IEEE Symposium on Information Visualization 2004 (InfoVis 2004).
  • Marc M. Sebrechts, Joanna Vasilakis, Michael S. Miller, John V. Cugini, Sharon J. Laskowski. Visualization of Search Results: A Comparative Evaluation of Text, 2D, and 3D Interfaces.
  • Melanie Tory and Torsten Möller, “Evaluating Visualizations: Do Expert Reviews Work?”, IEEE Computer Graphics and Applications, vol. 25, no. 5, Sept./Oct. 2005, pp.8-11.
  • Doug Schaffer et al. Navigating Hierarchically Clustered Networks through Fisheye and Full-Zoom Methods. ACM Transactions on Computer-Human Interaction, Vol. 3, No. 2, June 1996, Pages 162–188.
  • Kamran Sedig et al. Application of information visualization techniques to the design of a mathematical mindtool: a usability study. Information Visualization (2003) 2, 142–159
  • (on the usability of SoftVis) Marcus, A.; Comorski, D.; Sergeyev, A. Supporting the evolution of a software visualization tool through usability studies. Program Comprehension, 2005. IWPC 2005. Proceedings. 13th International Workshop on 15-16 May 2005 Page(s):307 – 316


Some researchers active in this area:

  • Keith Andrews – specialist in HCI and InfoVis evaluation.
  • Melanie Tory – evaluation of interaction techniques for InfoVis.

Evaluation of Visualizations for Information Retrieval
