Toward Multimodal Interaction of Scatterplot Spaces Exploration

Lin Shao, Graz University of Technology, Graz, Austria (l.shao@cgv.tugraz.at)
Tobias Schreck, Graz University of Technology, Graz, Austria (t.schreck@cgv.tugraz.at)

Abstract: The latest generation of large, vertically mounted multi-touch displays brings new opportunities for solving visual analytics tasks. Due to their size, it is possible to visualise and collaboratively interact with high-dimensional datasets and multiple views (e.g., scatterplots, scatterplot matrices, and parallel coordinates). However, using only multi-touch for input can be overly restrictive, and other modalities need to be considered to fully utilise the power of these screens. By adding natural language interaction, the user can interact directly with the visual analytics application from a distance. Incorporating eye tracking can help narrow down what the user is looking at or is interested in. In this paper, some of the challenges of using multi-touch as input for the analysis of scatterplot spaces on large, vertically mounted multi-touch displays are discussed and addressed by proposing the incorporation of additional interaction modalities.
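To make the idea of combining modalities more concrete, the minimal Python sketch below shows one hypothetical way to resolve a deictic voice command (e.g., "select these") against an eye-tracking fixation over a scatterplot rendered in screen space. The function names, the fixation radius, and the command vocabulary are illustrative assumptions and not part of the paper's described system.

```python
import numpy as np

def gaze_to_points(gaze_xy, screen_points, radius=40.0):
    """Return indices of scatterplot points within a fixation radius (pixels).

    gaze_xy       : (x, y) gaze sample in screen coordinates
    screen_points : (N, 2) array of data points projected to screen space
    radius        : assumed fixation tolerance in pixels (illustrative value)
    """
    distances = np.linalg.norm(screen_points - np.asarray(gaze_xy), axis=1)
    return np.where(distances <= radius)[0]

def resolve_command(command, gaze_idx, num_points):
    """Combine a spoken command with the gaze-based candidate set.

    A deictic phrase such as 'select these' keeps the gazed-at points,
    while 'remove these' drops them from the current selection.
    """
    if command.startswith("select"):
        return set(gaze_idx)
    if command.startswith("remove"):
        return set(range(num_points)) - set(gaze_idx)
    return set()

# Example: 200 random points in a 1920x1080 view, gaze near the centre.
rng = np.random.default_rng(0)
pts = rng.uniform([0, 0], [1920, 1080], size=(200, 2))
focus = gaze_to_points((960, 540), pts)
selected = resolve_command("select these", focus, num_points=len(pts))
print(f"{len(selected)} points selected via gaze + voice")
```

In such a setup, the gaze channel narrows the spatial scope of an otherwise ambiguous spoken command, which is one way the modalities discussed above could complement touch input from a distance.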
