User Experiences in ThrowIt: A Natural UI for Sharing Objects between Mobile Devices

The success of context-aware systems depends on intuitive user interfaces that facilitate communication between multiple devices. This paper presents Throw It, a natural user interface for sharing objects between mobile devices using gaze and hand gestures. To enable gesture control, Throw It combines an off-the-shelf monochrome camera with infrared lights and a hand-held six-degree-of-freedom sensor. To share an object, the user first locks their gaze on the object, then selects it with a short 'grab' gesture, and finally sends it to another mobile device with a 'throwing' gesture. The mobile devices locate each other via a context server. In our user study, participants found the idea of sharing digital objects with Throw It both useful and fascinating; however, sharing objects with many people at the same time raised security concerns.
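
The interaction flow described above (gaze lock, grab to select, throw to send via a context server) can be summarized with a minimal sketch. The component and method names below (GazeTracker, GestureSensor, ContextServer, share_object) are hypothetical illustrations under the assumptions stated in the abstract, not the paper's actual implementation.

```python
# Minimal sketch of the Throw It interaction loop.
# All component names are hypothetical stubs, not the paper's API.

class GazeTracker:
    """Stub for the IR-camera-based gaze estimator."""
    def fixated_object(self):
        # Return the id of the object the user is currently looking at.
        return "photo_042"


class GestureSensor:
    """Stub for the hand-held six-degree-of-freedom sensor."""
    def __init__(self, events):
        self.events = iter(events)

    def read(self):
        # Return the next recognized gesture, or None when idle.
        return next(self.events, None)


class ContextServer:
    """Stub for the server through which devices locate each other."""
    def locate_peer(self):
        # Return a callable that delivers an object to the peer device.
        return lambda obj: print(f"sent {obj!r} to peer device")


def share_object(gaze, gestures, server):
    # 1. Lock gaze on an object.
    target = gaze.fixated_object()
    if target is None:
        return
    # 2. Select it with a short 'grab' gesture.
    if gestures.read() == "grab":
        # 3. Send it to another device with a 'throw' gesture,
        #    the receiver being resolved via the context server.
        if gestures.read() == "throw":
            send = server.locate_peer()
            send(target)


share_object(GazeTracker(), GestureSensor(["grab", "throw"]), ContextServer())
```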
