TouchPosing: multi-modal interaction with geospatial data

Multi-touch interaction offers new opportunities for working with complex data. The exploration of geographic data in particular, which to this day relies mostly on mouse and keyboard input, could benefit from this interaction paradigm. However, the gestures required to operate complex systems such as Geographic Information Systems (GIS) grow more difficult with every additional function. This paper describes a novel interaction approach that allows non-expert users to easily explore geographic data through a combination of multi-touch gestures and hand postures. The additional input modality, hand pose, is intended to avoid overly complex multi-touch gestures. Furthermore, the screen of a wearable device serves as an additional output modality that both avoids occlusion and acts as a magic lens.
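
To illustrate the core idea, the following minimal sketch shows how a recognized hand pose could act as a mode switch, so that each touch gesture stays simple instead of growing more complex per GIS function. The pose set, event type, and handlers are illustrative assumptions, not the paper's actual implementation:

```python
from dataclasses import dataclass
from enum import Enum, auto

class HandPose(Enum):
    """Hypothetical pose vocabulary; the paper's own pose set may differ."""
    FLAT = auto()      # e.g. pan the map
    FIST = auto()      # e.g. zoom
    POINTING = auto()  # e.g. query a map feature

@dataclass
class TouchEvent:
    x: float  # normalized screen coordinate in [0, 1]
    y: float

def handle_touch(event: TouchEvent, pose: HandPose) -> str:
    """Dispatch a single-touch event according to the current hand pose.

    Because the pose selects the mode, the touch gesture itself can
    remain a plain tap or drag regardless of how many GIS functions
    the system exposes.
    """
    if pose is HandPose.FLAT:
        return f"pan map from ({event.x:.2f}, {event.y:.2f})"
    if pose is HandPose.FIST:
        return f"zoom around ({event.x:.2f}, {event.y:.2f})"
    return f"query feature at ({event.x:.2f}, {event.y:.2f})"

if __name__ == "__main__":
    # The same touch position yields different actions under different poses.
    touch = TouchEvent(0.4, 0.6)
    print(handle_touch(touch, HandPose.FLAT))
    print(handle_touch(touch, HandPose.POINTING))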
