Using the user's point of view for interaction on mobile devices

We study interaction modalities for mobile devices (smartphones and tablets) that rely on camera-based head tracking. This technique opens new possibilities for both input and output interaction. For output, by computing the position of the device relative to the user's head, it becomes possible, for example, to realistically control the viewpoint on a 3D scene (Head-Coupled Perspective, HCP). This technique increases the output interaction bandwidth by enhancing depth perception and by allowing the visualization of large workspaces (virtual window). For input, head movements can serve as a means of interacting with a mobile device; moreover, such an input modality requires no additional sensor beyond the built-in front-facing camera. In this paper, we classify the interaction possibilities offered by head tracking on smartphones and tablets. We then focus on output interaction by introducing several applications of HCP on both smartphones and tablets and by presenting the results of a qualitative user experiment.
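
To illustrate the geometry behind HCP, the sketch below builds an off-axis (asymmetric-frustum) projection and a matching view matrix from an estimated head position. It is a minimal sketch, assuming the front-camera tracker reports the head position in screen-centred metric coordinates (x right, y up, z towards the viewer); the function names `head_coupled_projection` and `head_coupled_view`, as well as the numeric example, are illustrative and not taken from the paper.

```python
import numpy as np

def head_coupled_projection(eye, screen_w, screen_h, near=0.01, far=100.0):
    """Asymmetric (off-axis) frustum for head-coupled perspective.

    eye      -- (x, y, z) head position in metres, in screen-centred
                coordinates: origin at the screen centre, x right,
                y up, z towards the viewer (z > 0).
    screen_w -- physical screen width in metres.
    screen_h -- physical screen height in metres.
    Returns a 4x4 OpenGL-style projection matrix.
    """
    ex, ey, ez = eye
    # Project the physical screen edges onto the near plane, as seen
    # from the tracked eye position.
    left   = (-screen_w / 2 - ex) * near / ez
    right  = ( screen_w / 2 - ex) * near / ez
    bottom = (-screen_h / 2 - ey) * near / ez
    top    = ( screen_h / 2 - ey) * near / ez

    m = np.zeros((4, 4))
    m[0, 0] = 2 * near / (right - left)
    m[1, 1] = 2 * near / (top - bottom)
    m[0, 2] = (right + left) / (right - left)
    m[1, 2] = (top + bottom) / (top - bottom)
    m[2, 2] = -(far + near) / (far - near)
    m[2, 3] = -2 * far * near / (far - near)
    m[3, 2] = -1.0
    return m

def head_coupled_view(eye):
    """View matrix: translate the scene so the camera sits at the eye."""
    v = np.eye(4)
    v[:3, 3] = -np.asarray(eye)
    return v

# Example: head 40 cm in front of a 10 cm x 6 cm screen, slightly to the left.
eye = (-0.05, 0.02, 0.40)
proj = head_coupled_projection(eye, screen_w=0.10, screen_h=0.06)
view = head_coupled_view(eye)
```

Updating `eye` every frame from the head tracker and re-deriving the frustum is what produces the virtual-window effect: the rendered scene stays registered with the physical screen as the user moves.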
