PhotoelasticTouch: transparent rubbery tangible interface using an LCD and photoelasticity

PhotoelasticTouch is a novel tabletop system designed to facilitate intuitive touch-based interaction through real objects made of transparent elastic material. The system uses vision-based recognition techniques together with the photoelastic properties of the transparent rubber to detect deformed regions of the elastic material. It works with elastic objects in a wide variety of shapes and does not require any explicit visual markers. Compared to traditional interactive surfaces, our 2.5-dimensional interface enables direct touch interaction and soft tactile feedback. In this paper we present our force-sensing technique based on photoelasticity and describe the implementation of our prototype system. We also present three practical applications of PhotoelasticTouch: a force-sensitive touch panel, a tangible face application, and a paint application.
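To make the sensing idea concrete, the following is a minimal sketch (not the authors' implementation) of how a camera-side detector for such a photoelastic surface could look. It assumes the camera views the elastic object through a polarizing filter crossed with the LCD's polarized backlight, so pressed (stressed) regions leak light and appear as bright spots; the function name, thresholds, and camera index are illustrative assumptions, and the OpenCV 4.x API is used.

```python
# Hedged sketch: detect bright stress regions on a photoelastic surface
# seen through a crossed polarizer. Not the paper's actual pipeline.
import cv2
import numpy as np

def detect_touches(frame_bgr, brightness_thresh=60, min_area=50):
    """Return (x, y, brightness) for each bright stressed region."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Stressed rubber rotates the light's polarization, so it passes the
    # crossed polarizer; a simple threshold isolates those bright regions.
    _, mask = cv2.threshold(gray, brightness_thresh, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    touches = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue
        m = cv2.moments(c)
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        # Mean brightness inside the region as a rough proxy for force.
        region_mask = np.zeros_like(gray)
        cv2.drawContours(region_mask, [c], -1, 255, -1)
        brightness = cv2.mean(gray, mask=region_mask)[0]
        touches.append((cx, cy, brightness))
    return touches

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # camera index is an assumption
    ok, frame = cap.read()
    if ok:
        print(detect_touches(frame))
    cap.release()
```

Treating region brightness as a force proxy is only a first approximation; the paper's force-sensitive touch panel presumably relies on a calibrated mapping from the photoelastic response to applied pressure.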
