Air+Touch: Interweaving Touch & In-Air Gestures

We present Air+Touch, a new class of interactions that interweave touch events with in-air gestures, offering a unified input modality with expressiveness greater than either modality alone. We demonstrate how air and touch are highly complementary: touch designates targets and segments in-air gestures, while in-air gestures add expressivity to touch events. For example, a user can draw a circle in the air and tap to trigger a context menu, make a finger 'high jump' between two touches to select a region of text, or drag and make an in-air 'pigtail' to copy text to the clipboard. Through an observational study, we devised a basic taxonomy of Air+Touch interactions, based on whether the in-air component occurs before, between, or after touches. To illustrate the potential of our approach, we built four applications showcasing seven exemplar Air+Touch interactions we created.
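The before/between/after taxonomy can be illustrated with a minimal sketch. This is a hypothetical segmentation routine, not the paper's implementation: the event representation and function name are assumptions for illustration only.

```python
# Hypothetical sketch of the Air+Touch timing taxonomy: given a
# time-ordered stream of touch and in-air events, decide whether the
# in-air component occurs before, between, or after the touches.
# Event format ("touch" | "air", timestamp) is an assumption, not
# the paper's actual sensing pipeline.

def classify_air_gesture(events):
    """Return 'before', 'between', or 'after' for the in-air portion
    of an interaction, or None if touch or air events are missing."""
    touch_times = [t for kind, t in events if kind == "touch"]
    air_times = [t for kind, t in events if kind == "air"]
    if not touch_times or not air_times:
        return None
    if max(air_times) < min(touch_times):
        return "before"   # e.g. circle in the air, then tap
    if min(air_times) > max(touch_times):
        return "after"    # e.g. tap, then flick upward in the air
    return "between"      # e.g. 'high jump' between two touches
```

For instance, an air circle followed by a tap (`[("air", 0.0), ("air", 0.2), ("touch", 0.5)]`) falls into the "before" class, matching the context-menu example above.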
