Interactive augmented reality

Augmented reality can provide a new experience to users by adding virtual objects where they are relevant in the real world. The new generation of mobile phones offers a platform for developing augmented reality applications for industry as well as for the general public. Although some applications are reaching commercial viability, the technology is still limited. The main problem designers face when building an augmented reality application is implementing an interaction method. Interacting through the mobile's keyboard can prevent the user from looking at the screen: mobile devices typically have small keyboards, which are difficult to use without looking at them. Displaying a virtual keyboard on the screen is not a good solution either, as the small screen is needed to display the augmented real world. This thesis proposes a gesture-based interaction approach for this kind of application. The idea is that by holding and moving the mobile phone in different ways, users are able to interact with virtual content. This approach combines the use of input devices such as keyboards or joysticks and the detection of gestures performed with the body into one scenario: the detection of the phone's movements performed by users. Based on an investigation of people's own preferred gestures, a repertoire of manipulations was defined and used to implement a demonstrator application running on a mobile phone. This demo was tested to evaluate gesture-based interaction within an augmented reality application. The experiment shows that it is possible to implement and use gesture-based interaction in augmented reality. Gestures can be designed to overcome the limitations of augmented reality and offer a natural and easy-to-learn interaction to the user.
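As a minimal sketch of what "detection of the phone's movements" might look like in practice, the snippet below classifies a short trace of accelerometer samples into a few simple gestures using hand-tuned thresholds. This is purely illustrative and not taken from the thesis; the gesture names, units (m/s²), and threshold values are all assumptions, and real recognizers typically use sequence models (e.g. HMMs) rather than fixed thresholds.

```python
# Illustrative sketch (not the thesis implementation): classify a short
# trace of (x, y, z) accelerometer samples into a simple phone gesture.
# All thresholds and gesture names are assumed for the example.

def classify_gesture(samples, tilt_threshold=4.0, shake_threshold=15.0):
    """Return 'shake', 'tilt_left', 'tilt_right', or 'idle' for a trace
    of (x, y, z) accelerometer readings in m/s^2."""
    # A shake shows up as a large swing on any axis.
    if any(abs(v) > shake_threshold for axis in zip(*samples) for v in axis):
        return "shake"
    # A sustained lateral gravity component indicates a tilt.
    mean_x = sum(s[0] for s in samples) / len(samples)
    if mean_x > tilt_threshold:
        return "tilt_right"
    if mean_x < -tilt_threshold:
        return "tilt_left"
    return "idle"


# Example: a phone held tilted to the right for ten samples.
print(classify_gesture([(6.0, 0.0, 9.8)] * 10))
```

Once a gesture label is produced, an application can map it to a manipulation of the virtual content (for example, tilting to rotate an object), keeping the user's eyes on the screen rather than on a keyboard.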
