A Device-Independent 3D User Interface for Mobile Phones Based on Motion and Tracking Techniques

User interaction on mobile phones has traditionally been based on keypad behaviors that users have adapted to their devices. Although this type of interaction may be suitable for some content, it can be improved from a usability standpoint. In this paper we present a new user interface for mobile phones based on 3D navigation. To enhance the user experience, the interface renders 3D graphics in real time and lets users interact through natural movements of their devices. The goals of this development are to improve user interaction with, and accessibility to, web content and interactive multimedia applications by avoiding key-based or mouse-based navigation, and to propose a software solution adaptable to many different mobile devices. To that end, user input can be detected by different input devices, such as accelerometers or cameras, as well as by traditional keypads. Since mobile phones with on-board digital cameras are now widely available at low cost, the proposed 3D user interface exploits the acquisition capabilities of these cameras: a differential algorithm is applied to estimate phone movements from the video images. The results of the performance evaluation of the 3D user interface show that the proposed algorithm not only achieves robust motion tracking under extreme lighting conditions, but also adds negligible overhead to the system. Finally, a 3D environment has been designed to evaluate the performance of the presented approach, which has been successfully tested with actual users.
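The differential approach mentioned above compares consecutive camera frames to recover the phone's apparent motion. As a minimal illustrative sketch (not the paper's actual implementation), global motion can be estimated by searching for the 2D shift that minimizes the mean absolute difference between two grayscale frames; the function name, search window, and list-of-lists frame format below are assumptions for illustration:

```python
def estimate_shift(prev, curr, max_shift=4):
    """Estimate the global (dx, dy) shift between two grayscale frames
    given as 2D lists of pixel values, by exhaustively searching a small
    window and minimizing the mean absolute frame difference.

    Convention: curr[y][x] is assumed to match prev[y - dy][x - dx],
    i.e. (dx, dy) is how far the image content moved from prev to curr.
    """
    h, w = len(prev), len(prev[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = n = 0
            # Accumulate the absolute difference over the overlap region
            # of the two frames for this candidate shift.
            for y in range(max(0, dy), h + min(0, dy)):
                for x in range(max(0, dx), w + min(0, dx)):
                    err += abs(curr[y][x] - prev[y - dy][x - dx])
                    n += 1
            if n and err / n < best_err:
                best_err, best = err / n, (dx, dy)
    return best
```

On a real device, the estimated shift would then be mapped to a navigation command (e.g. panning the 3D view); a practical implementation would use a pyramid or gradient-based optical-flow method rather than this exhaustive search.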
