Toward Mixed Method Evaluations of Scientific Visualizations and Design Process as an Evaluation Tool
Toward Mixed Method Evaluations of Scientific Visualizations and Design Process as an Evaluation Tool. Bret Jackson, Dane Coffey, Lauren Thorson, David Schroeder, Arin Ellingson, David Nuckley, Daniel F. Keefe. BELIV 2012: Beyond Time and Errors: Novel Evaluation Methods for Visualization, Workshop at IEEE VisWeek 2012 (2012) pp. 1–6
In this position paper we discuss the successes and limitations of current evaluation strategies for scientific visualizations and argue for embracing a mixed methods strategy of evaluation. The most novel contribution of the approach we advocate is a new emphasis on employing design processes as practiced in related fields (e.g., graphic design, illustration, architecture) as a formalized mode of evaluation for data visualizations. To motivate this position, we describe a series of recent evaluations of scientific visualization interfaces and computer graphics strategies conducted within our research group. Complementing these more traditional evaluations, our visualization research group also regularly employs sketching, critique, and other design methods that have been formalized over years of practice in design fields. Our experience has convinced us that these activities are invaluable, often providing much more detailed evaluative feedback about our visualization systems than that obtained via more traditional user studies and the like. We believe that if design-based evaluation methodologies (e.g., ideation, sketching, critique) can be taught and embraced within the visualization community, they may become one of the most effective future strategies for both formative and summative evaluations.