Gesture-aware remote controls: guidelines and interaction technique

Interaction with TV sets, set-top boxes, or media centers differs markedly from interaction with personal computers: not only does a typical remote control suffer from severe form-factor limitations, but the user may well be slouching on a sofa. As interactive televisions offer ever more data, features, and services, we propose to exploit the new capabilities provided by gesture-aware remote controls. We report three user studies that suggest guidelines for the design of a gestural vocabulary, and we propose five novel interaction techniques. Study 1 shows that users spontaneously perform pitch and yaw gestures as their first modality when interacting with a remote control. Study 2 indicates that users can accurately select up to 5 items with eyes-free roll gestures. Capitalizing on these findings, we designed five interaction techniques that rely on device motion, button-based interaction, or both. All of them support the transition from novice to expert use when selecting favorites. Study 3 experimentally compares these techniques and reveals that moving the device in 3D space, combined with finger presses on its surface, is feasible, fast, and accurate. Finally, we discuss the integration of these techniques into a coherent multimedia menu system.
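As an illustration of the eyes-free roll selection summarized above (up to 5 items selected by rolling the device), the following minimal Python sketch quantizes a roll angle into one of five item slots. The function name, angle range, and parameters are assumptions made for illustration only; they are not the paper's implementation.

    # Hypothetical sketch (not the authors' implementation): map the roll angle of a
    # motion-sensing remote to one of n_items favorites, assuming a usable wrist
    # range of roughly -90 to +90 degrees.
    def roll_to_item(roll_deg, n_items=5, roll_min=-90.0, roll_max=90.0):
        """Quantize a roll angle in degrees into an item index in [0, n_items - 1]."""
        clamped = max(roll_min, min(roll_max, roll_deg))
        fraction = (clamped - roll_min) / (roll_max - roll_min)
        return min(n_items - 1, int(fraction * n_items))

    # Example: a roll of +30 degrees with 5 items selects item index 3.
    print(roll_to_item(30.0))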
