Touchable 3D video system

Multimedia technologies are reaching the limits of what can be achieved with audio-visual media that viewers consume passively. A key factor that can further enhance the user's experience in terms of impressiveness and immersion is interaction. Among everyday forms of interaction, haptic interaction plays a prominent role in enhancing users' quality of experience and in promoting physical and emotional development. A critical next step in multimedia research is therefore to bring the sense of touch, or haptics, into multimedia systems and applications. This article proposes a touchable 3D video system in which viewers can actively touch a video scene through a force-feedback device, and presents the underlying technologies in three functional components: (1) content generation, (2) content transmission, and (3) viewing and interaction. First, we introduce a depth image-based haptic representation (DIBHR) method that adds haptic and heightmap images to the traditional depth image-based representation (DIBR) in order to encode the haptic surface properties of the video media. In this representation, the haptic image stores stiffness, static friction, and dynamic friction, while the heightmap image stores the roughness of the video content. Based on this representation, we discuss how to generate synthetic and natural (real) video media with a 3D modeling tool and a depth camera, respectively. Next, we introduce a transmission mechanism based on the MPEG-4 framework, in which new MPEG-4 BIFS nodes are designed to describe the haptic scene. Finally, we describe a haptic rendering algorithm that computes the interaction force between the scene and the viewer. The performance of this algorithm is evaluated in terms of computation time and the smoothness of the contact force: it runs marginally within the 1 kHz update rate required for stable interaction forces, and, with a median filter, it produces smoother contact forces for depth images that contain high-frequency geometric noise.
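To make the representation concrete, the sketch below shows one possible per-pixel layout of a DIBHR-style frame and a 3x3 median filter over its depth channel, corresponding to the noise-smoothing step mentioned above. This is a minimal illustrative sketch in C++; the structure and names (DibhrFrame, HapticTexel, medianFilterDepth) are assumptions for illustration, not the authors' actual data structures or code.

```cpp
// Sketch only: a per-pixel DIBHR-style frame (color + depth + haptic + heightmap)
// and a 3x3 median filter on the depth channel to suppress high-frequency
// geometric noise such as that produced by a depth camera.
#include <algorithm>
#include <cstdint>
#include <vector>

struct HapticTexel {
    float stiffness;        // surface stiffness, from the haptic image
    float staticFriction;   // static friction coefficient
    float dynamicFriction;  // dynamic friction coefficient
};

struct DibhrFrame {
    int width = 0, height = 0;
    std::vector<uint8_t>     color;     // RGB color image (3 bytes per pixel)
    std::vector<float>       depth;     // per-pixel depth (DIBR depth image)
    std::vector<HapticTexel> haptic;    // per-pixel haptic image
    std::vector<float>       heightmap; // per-pixel roughness/height offsets
};

// Replace each interior depth sample with the median of its 3x3 neighborhood.
std::vector<float> medianFilterDepth(const DibhrFrame& f) {
    std::vector<float> out(f.depth);
    for (int y = 1; y + 1 < f.height; ++y) {
        for (int x = 1; x + 1 < f.width; ++x) {
            float window[9];
            int k = 0;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx)
                    window[k++] = f.depth[(y + dy) * f.width + (x + dx)];
            std::nth_element(window, window + 4, window + 9);
            out[y * f.width + x] = window[4];  // median of the neighborhood
        }
    }
    return out;
}

int main() {
    DibhrFrame f;
    f.width = 4; f.height = 4;
    f.depth.assign(16, 1.0f);
    f.depth[5] = 5.0f;                              // a noisy spike at pixel (1,1)
    f.color.assign(16 * 3, 0);
    f.haptic.assign(16, HapticTexel{1000.0f, 0.6f, 0.4f});
    f.heightmap.assign(16, 0.0f);
    std::vector<float> smoothed = medianFilterDepth(f);
    // The spike is replaced by the neighborhood median (1.0).
    return smoothed[5] == 1.0f ? 0 : 1;
}
```

In a full pipeline, a haptic rendering loop would sample these per-pixel properties at the contact point at roughly 1 kHz to compute the stiffness and friction components of the interaction force.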
