Intuitive substitute interface

This paper proposes the “Substitute Interface,” which uses the flat surfaces of everyday objects as part of an ad hoc mobile device. The interface is realized by combining wearable devices: a head-mounted display with a camera and a ring-type microphone. The camera recognizes which object the user intends to employ. When the user picks up and taps an object, such as a notebook, a virtual display is overlaid on it, and the user can operate the ad hoc mobile device as if the object were part of the device. The display size can be changed simply by selecting a larger object. The user’s pointing/selection actions are recognized by combining the camera images with the ring-type microphone’s tap detection. We first investigate how tablet devices are used and then create a prototype that operates as a tablet device. Experiments with the prototype confirm that the proposal functions as intended.
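The abstract describes recognizing a selection by combining the camera (which tracks where the fingertip is on the overlaid display) with the ring-type microphone (which detects the tap sound). One way such sensor fusion could work is sketched below; all names, data structures, and the 50 ms timing threshold are illustrative assumptions, not the paper's implementation.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

# Hypothetical sketch: fuse camera fingertip tracking with tap-sound
# events from a ring-type microphone. A tap is registered at the
# fingertip position whose timestamp is closest to the detected tap
# sound, provided the two sensors agree within a small time window.

@dataclass
class FingertipSample:
    t: float  # timestamp in seconds (camera frame time)
    x: float  # fingertip position on the virtual display, normalized 0..1
    y: float

def fuse_tap(samples: List[FingertipSample],
             tap_time: float,
             max_skew: float = 0.05) -> Optional[Tuple[float, float]]:
    """Return the (x, y) fingertip position closest in time to a tap
    sound, or None if no camera sample lies within max_skew seconds."""
    best: Optional[FingertipSample] = None
    for s in samples:
        skew = abs(s.t - tap_time)
        if skew <= max_skew and (best is None or skew < abs(best.t - tap_time)):
            best = s
    return (best.x, best.y) if best is not None else None
```

For example, if the microphone reports a tap at t = 1.02 s and the nearest camera sample is at t = 1.00 s, the fingertip position from that frame is taken as the touch point; a tap sound with no nearby fingertip observation is rejected, which filters out accidental noises.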