Hotspotizer: end-user authoring of mid-air gestural interactions

Drawing from a user-centered design process and guidelines derived from the literature, we developed a paradigm based on space discretization for declaratively authoring mid-air gestures and implemented it in Hotspotizer, an end-to-end toolkit for mapping custom gestures to keyboard commands. Our implementation empowers diverse user populations -- including end-users without domain expertise -- to develop custom gestural interfaces within minutes, for use with arbitrary applications.
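
To make the space-discretization paradigm concrete, here is a minimal sketch in Python. Hotspotizer itself is a Kinect-based desktop tool, and every name and parameter below (CELL_SIZE, Gesture, GestureMatcher) is a hypothetical stand-in rather than Hotspotizer's actual API. The sketch declares a gesture as an ordered sequence of grid-cell sets and matches a tracked joint's trajectory against it, emitting a keyboard command on completion.

```python
# Illustrative only: all names and parameters here (CELL_SIZE, Gesture,
# GestureMatcher) are hypothetical stand-ins, not Hotspotizer's actual API.

from dataclasses import dataclass

CELL_SIZE = 0.15  # edge length of one grid cell, in meters (assumed value)


def discretize(x: float, y: float, z: float) -> tuple[int, int, int]:
    """Map a continuous joint position to a coarse grid cell ("hotspot")."""
    return (int(x // CELL_SIZE), int(y // CELL_SIZE), int(z // CELL_SIZE))


@dataclass
class Gesture:
    """A gesture declared as an ordered sequence of hotspot sets.

    Each frame lists the cells the tracked joint may occupy at that step;
    the gesture completes when the joint visits one cell from each frame,
    in order.
    """
    name: str
    frames: list[set[tuple[int, int, int]]]
    command: str  # keyboard command to emit on a match, e.g. "LEFT"


@dataclass
class GestureMatcher:
    gesture: Gesture
    _step: int = 0

    def feed(self, joint_xyz: tuple[float, float, float]) -> bool:
        """Advance the matcher with one joint sample; True on a full match.

        A production matcher would also reset on timeouts or stray movement;
        that bookkeeping is omitted here for brevity.
        """
        if discretize(*joint_xyz) in self.gesture.frames[self._step]:
            self._step += 1
            if self._step == len(self.gesture.frames):
                self._step = 0
                return True
        return False


# Usage: a right-to-left swipe of one hand, mapped to the LEFT key.
swipe = Gesture(
    name="swipe-left",
    frames=[{(2, 1, 3)}, {(1, 1, 3)}, {(0, 1, 3)}],
    command="LEFT",
)
matcher = GestureMatcher(swipe)
for sample in [(0.35, 0.20, 0.50), (0.20, 0.20, 0.50), (0.05, 0.20, 0.50)]:
    if matcher.feed(sample):
        print(f"emit {swipe.command}")  # a real tool would synthesize a key event
```

Declaring each step as a set of cells, rather than a continuous path, is what makes this kind of specification declarative: a gesture can be authored by marking regions of space, with no recognizer training and some built-in tolerance for noisy tracking.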
