Optical Touch Sensing on Nonparametric Rear-Projection Surfaces for Interactive Physical-Virtual Experiences

We demonstrate a generalizable method for unified multitouch detection and response on a variety of nonparametric and parametric surfaces to support interactive physical-virtual experiences. The method employs multiple infrared (IR) cameras, one or more projectors, IR light sources, and a rear-projection surface. IR light reflected off human fingers is captured by the cameras through matched IR pass filters, allowing multiple simultaneous finger-touch events to be detected and localized. Processing of these events is tightly coupled with the rendering system, which produces auditory responses and visual responses displayed on the surface by the projector(s), yielding a responsive, interactive physical-virtual experience. We demonstrate the method on two nonparametric head-shaped surfaces and a planar surface, and we illustrate its applicability in an interactive medical training scenario in which one of the head-shaped surfaces supports hands-on, touch-sensitive training with dynamic physical-virtual patient behavior.
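As a rough illustration of the kind of image processing such a pipeline involves, the Python sketch below thresholds a single IR camera frame and reports bright-blob centroids as candidate touch points. This is a minimal sketch only, assuming an OpenCV-based workflow; the threshold value, blob-size bounds, and capture device index are illustrative assumptions, and the multi-camera calibration and mapping of detections onto the nonparametric surface are not reproduced here.

import cv2

def detect_touch_points(ir_frame, threshold=200, min_area=30, max_area=400):
    """Locate candidate finger-touch points in a single grayscale IR frame.

    Fingers touching the rear-projection surface reflect IR light back
    toward the camera and appear as small bright blobs once the frame is
    thresholded. Blob centroids are returned in pixel coordinates; a full
    system would map these into surface coordinates via per-camera
    calibration.
    """
    # Suppress sensor noise, then keep only the brightest regions.
    blurred = cv2.GaussianBlur(ir_frame, (5, 5), 0)
    _, binary = cv2.threshold(blurred, threshold, 255, cv2.THRESH_BINARY)

    # Each connected bright region is a candidate touch blob (OpenCV 4.x
    # findContours return signature assumed).
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    touches = []
    for contour in contours:
        area = cv2.contourArea(contour)
        if min_area <= area <= max_area:  # reject noise and large reflections
            m = cv2.moments(contour)
            if m["m00"] > 0:
                cx = m["m10"] / m["m00"]  # blob centroid, pixel coordinates
                cy = m["m01"] / m["m00"]
                touches.append((cx, cy))
    return touches

if __name__ == "__main__":
    # Illustrative capture loop for one IR camera; the device index 0 is an
    # assumption and would differ per setup.
    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y) in detect_touch_points(gray):
            cv2.circle(frame, (int(x), int(y)), 8, (0, 255, 0), 2)
        cv2.imshow("touches", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

In a complete system, each camera's pixel coordinates would additionally be mapped into the display surface's texture space (e.g., via structured-light calibration) so that localized touch events can drive the projected auditory and visual responses.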
