Scaling Up Multi-Touch Selection and Querying: Interfaces and Applications for Combining Mobile Multi-Touch Input with Large-Scale Visualization Displays
Scaling Up Multi-Touch Selection and Querying: Interfaces and Applications for Combining Mobile Multi-Touch Input with Large-Scale Visualization Displays. Daniel F. Keefe, Ankit Gupta, Daniel Feldman, John V. Carlis, Susi Krehbiel Keefe, Tim Griffin. International Journal of Human-Computer Studies (2012) Volume 70, Number 10 pp. 703–713
We present a mobile multi-touch interface for selecting, querying, and visually exploring data visualized on large, high-resolution displays. Although emerging large (e.g., ~10 meters wide), high-resolution displays provide great potential for visualizing dense, complex datasets, their utility is often limited by a fundamental interaction problem: the need to interact with data from multiple positions around a large room. Our solution is a selection and querying interface that combines a hand-held multi-touch device with 6 degree-of-freedom tracking in the physical space that surrounds the large display. The interface leverages context from both the user’s physical position in the room and the data currently being visualized in order to interpret multi-touch gestures. It also utilizes progressive refinement, favoring several quick approximate gestures over a single complex input in order to most effectively map the small mobile multi-touch input space to the large display wall. The approach is evaluated through two interdisciplinary visualization applications: a multivariate data visualization for social scientists, and a visual database querying tool for biochemistry. The interface was effective in both scenarios, leading to new domain-specific insights and suggesting valuable guidance for future developers.
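The core geometric step behind this kind of interface is mapping a touch on a 6-DOF-tracked handheld to a point on the distant display wall, typically by casting a ray from the tracked device pose and intersecting it with the wall plane. The sketch below is a hypothetical illustration of that idea, not the authors' implementation; the function name, coordinate conventions, and the assumption that the wall is the plane z = 0 are all choices made for this example.

```python
import numpy as np

def touch_to_wall(device_pos, device_rot, touch_uv, screen_size, wall_z=0.0):
    """Map a touch on a 6-DOF-tracked handheld to a point on a wall display.

    Hypothetical sketch: coordinate conventions are assumptions, not the
    paper's actual method.

    device_pos : (3,) device origin in room coordinates (meters)
    device_rot : (3,3) rotation matrix, device frame -> room frame
    touch_uv   : (u, v) touch position in [0,1]^2 on the device screen
    screen_size: (w, h) physical screen size in meters
    wall_z     : the display wall is the plane z = wall_z in room coordinates

    Returns (x, y) on the wall, or None if the ray misses the wall.
    """
    # Touch point in device-local coordinates: the screen lies in the local
    # x-y plane, centered on the device origin; local -z is "forward".
    local = np.array([(touch_uv[0] - 0.5) * screen_size[0],
                      (touch_uv[1] - 0.5) * screen_size[1],
                      0.0])
    origin = device_pos + device_rot @ local
    forward = device_rot @ np.array([0.0, 0.0, -1.0])  # ray direction
    if abs(forward[2]) < 1e-9:
        return None  # ray is parallel to the wall plane
    t = (wall_z - origin[2]) / forward[2]
    if t <= 0:
        return None  # wall is behind the device
    hit = origin + t * forward
    return float(hit[0]), float(hit[1])
```

A quick approximate selection from such a ray cast could then be refined with further gestures on the handheld, in the spirit of the progressive refinement the abstract describes.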