Camera phone based motion sensing: interaction techniques, applications and performance study

This paper presents TinyMotion, a pure software approach for detecting a mobile phone user's hand movement in real time by analyzing image sequences captured by the built-in camera. We present the design and implementation of TinyMotion and several interactive applications based on it. Through both an informal evaluation and a formal 17-participant user study, we found that (1) TinyMotion can detect camera movement reliably under most background and illumination conditions; (2) target acquisition tasks based on TinyMotion follow Fitts' law, and Fitts' law parameters can be used to measure TinyMotion-based pointing performance; (3) users can enter sentences faster with Vision TiltText, a TinyMotion-enabled input method, than with MultiTap after only a few minutes of practice; (4) using a camera phone as a handwriting capture device and performing large-vocabulary, multilingual, real-time handwriting recognition on the phone are feasible; and (5) TinyMotion-based gaming is enjoyable and immediately available on current-generation camera phones. We also report user experiences and problems with TinyMotion-based interaction as resources for the future design and development of mobile interfaces.
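The core idea of analyzing successive camera frames to recover hand movement can be illustrated with block matching, a standard motion-estimation technique from video coding. The sketch below is a minimal, hypothetical illustration (not TinyMotion's actual implementation): it shifts a central block of the current frame over a small search window in the previous frame and keeps the offset with the lowest sum of absolute differences (SAD).

```python
import numpy as np

def estimate_motion(prev, curr, search=4):
    """Estimate a global (dx, dy) shift between two grayscale frames
    by exhaustive block matching over a +/-`search` pixel window,
    minimizing the sum of absolute differences (SAD)."""
    h, w = prev.shape
    # Central block, inset so every shifted comparison stays in bounds.
    y0, y1 = search, h - search
    x0, x1 = search, w - search
    block = curr[y0:y1, x0:x1].astype(np.int32)

    best_sad, best_dxy = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            ref = prev[y0 + dy:y1 + dy, x0 + dx:x1 + dx].astype(np.int32)
            sad = int(np.abs(block - ref).sum())
            if best_sad is None or sad < best_sad:
                best_sad, best_dxy = sad, (dx, dy)
    return best_dxy
```

An exhaustive search like this is O(search² · block area) per frame; real phone implementations typically trade accuracy for speed with coarser search patterns, which is consistent with the paper's emphasis on real-time operation on commodity handsets.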
