LucidTouch: a see-through mobile device

Touch is a compelling input modality for interactive devices; however, touch input on the small screen of a mobile device is problematic because a user's fingers occlude the graphical elements they wish to work with. In this paper, we present LucidTouch, a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device. The key to making this usable is what we call pseudo-transparency: by overlaying an image of the user's hands onto the screen, we create the illusion that the mobile device itself is semi-transparent. This pseudo-transparency allows users to accurately acquire targets while not occluding the screen with their fingers and hands. LucidTouch also supports multi-touch input, allowing users to operate the device simultaneously with all 10 fingers. We present initial study results indicating that many users preferred touching the back of the device over touching the front, due to reduced occlusion, higher precision, and the ability to perform multi-finger input.
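The pseudo-transparency effect described above can be illustrated with a minimal compositing sketch. This is a hypothetical reconstruction, not the paper's implementation: it assumes the hand image comes from a camera facing the back of the device (so it must be mirrored to match the user's viewpoint) and is alpha-blended over the UI at a fixed opacity. The function name and `alpha` parameter are illustrative.

```python
import numpy as np

def pseudo_transparent_overlay(screen, hand, alpha=0.4):
    """Blend a mirrored image of the user's hands over the UI,
    creating the illusion of a semi-transparent device.

    screen, hand: HxWx3 float arrays with values in [0, 1].
    alpha: opacity of the hand layer (assumed fixed here).
    """
    # The camera sees the hands from behind the device, so flip the
    # hand image left-to-right to match the user's point of view.
    mirrored = hand[:, ::-1, :]
    # Standard alpha blend: hand layer over the screen contents.
    return alpha * mirrored + (1.0 - alpha) * screen
```

In a real system the hand layer would be segmented from the camera feed and updated per frame; a per-pixel alpha mask (rather than a single scalar) would keep the UI fully visible outside the hand silhouette.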
