Mobiles as Portals for Interacting with Virtual Data Visualizations

We propose a set of techniques that leverage mobile devices as lenses to explore, interact with, and annotate n-dimensional data visualizations. The widespread availability of mobile devices, with their arrays of integrated sensors, opens up opportunities to create experiences in which anyone can explore and interact with large information spaces anywhere. In this paper, we revisit ideas behind the Chameleon prototype of Fitzmaurice et al., envisioned in the 1990s for navigation before spatially-aware devices became mainstream. We also take advantage of other input modalities, such as pen and touch, to not only navigate the space using the mobile device as a lens, but also interact with and annotate it through the addition of toolglasses.

[1] Raimund Dachselt et al. Use your head: tangible windows for 3D information spaces in a tabletop environment. ITS 2012.

[2] Shumin Zhai et al. Virtual reality for palmtop computers. ACM TOIS 1993.

[3] George W. Fitzmaurice et al. Situated information spaces and spatially aware palmtop computers. CACM 1993.

[4] Y. Guiard. Asymmetric division of labor in human skilled bimanual action: the kinematic chain as a model. Journal of Motor Behavior 1987.

[5] Dieter Schmalstieg et al. GlassHands: Interaction Around Unmodified Mobile Devices Using Sunglasses. ISS 2016.

[6] William Buxton et al. Pen + touch = new tools. UIST 2010.

[7] Heidrun Schumann et al. Tangible views for information visualization. ITS 2010.

[8] Heidrun Schumann et al. Interactive Lenses for Visualization: An Extended Survey. Computer Graphics Forum 2017.

[9] Tobias Isenberg et al. Lightweight Relief Shearing for Enhanced Terrain Perception on Interactive Maps. CHI 2015.

[10] Yvonne Rogers et al. HuddleLamp: Spatially-Aware Mobile Displays for Ad-hoc Around-the-Table Collaboration. ITS 2014.

[11] Raimund Dachselt et al. PaperLens: advanced magic lens interaction above the tabletop. ITS 2009.

[12] William Buxton et al. Boom chameleon: simultaneous capture of 3D viewpoint, voice and gesture annotations on a spatially-aware display. UIST 2002.

[13] Pourang Irani et al. SAMMI: A Spatially-Aware Multi-Mobile Interface for Analytic Map Navigation Tasks. MobileHCI 2015.

[14] Jin Sung Choi et al. Freeze-Set-Go interaction method for handheld mobile augmented reality environments. VRST 2009.

[15] Jens Grubert et al. HeadPhones: Ad Hoc Mobile Multi-Display Environments through Head Tracking. CHI 2017.

[16] Yvonne Rogers et al. Demonstrating HuddleLamp: Spatially-Aware Mobile Displays for Ad-hoc Around-the-Table Collaboration. ITS 2014.

[17] Dieter Schmalstieg et al. The utility of Magic Lens interfaces on handheld devices for touristic map navigation. Pervasive and Mobile Computing 2015.

[18] Abigail Sellen et al. Toward compound navigation tasks on mobiles via spatial manipulation. MobileHCI 2013.

[19] Benjamin B. Bederson et al. A review of overview+detail, zooming, and focus+context interfaces. ACM Computing Surveys 2009.

[20] Dieter Schmalstieg et al. MultiFi: Multi Fidelity Interaction with Displays On and Around the Body. CHI 2015.

[21] Ricardo Langner et al. Investigating the Use of Spatial Interaction for 3D Data Visualization on Mobile Devices. ISS 2017.

[22] Johanna Beyer et al. The Hologram in My Hand: How Effective is Interactive Exploration of 3D Visualizations in Immersive Tangible Augmented Reality? IEEE Transactions on Visualization and Computer Graphics 2018.

[23] Ka-Ping Yee et al. Peephole displays: pen interaction on spatially aware handheld computers. CHI 2003.

[24] M. Sheelagh T. Carpendale et al. A Descriptive Framework for Temporal Data Visualizations Based on Generalized Space-Time Cubes. Computer Graphics Forum 2017.