Selective visuo-haptic rendering of heterogeneous objects in “parallel universes”

Haptic rendering of complex models is often prohibitively expensive because it requires a much higher update rate than visual rendering. Previous works have addressed this issue by introducing local simulation or multi-rate simulation for the two pipelines. Although these approaches have improved the capacity of the haptic rendering pipeline, they do not consider heterogeneous scenes, in which rigid and deformable objects coexist and lie close to each other. In this paper, we propose a novel approach to support interactive visuo-haptic rendering of complex heterogeneous models. The approach incorporates different collision detection and response algorithms and seamlessly switches them on and off on the fly as the haptic interaction point (HIP) travels through the scene. The selection of rendered models is based on the hypothesis of “parallel universes”, in which the transition from rendering one group of models to another is completely transparent to the user. To realize this idea, we propose a procedure that converts a traditional single-universe scene into a “multiverse” scene, where the original models are grouped and assigned to parallel universes according to the rendering requirements of the scene rather than locality alone. We also add simplified visual objects as background avatars in each parallel universe to visually preserve the original scene without overly increasing its complexity. We evaluated the proposed approach in a haptically enabled needle thoracostomy training environment, and the results demonstrate that it substantially accelerates visuo-haptic rendering of scenes with complex heterogeneous objects.
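The universe-switching mechanism described above can be sketched in code. The following is a minimal illustrative sketch, not the authors' implementation: it assumes each “parallel universe” owns a spatial region, a set of fully simulated models, and a set of simplified visual-only avatars, and that the active universe is swapped when the HIP crosses a region boundary. All class and field names here are hypothetical.

```python
# Hypothetical sketch of the "parallel universes" switching idea.
# Each universe holds fully simulated foreground models plus simplified
# visual-only background avatars; as the haptic interaction point (HIP)
# moves, the universe whose region contains it becomes active, and only
# that universe's collision detection/response pipeline runs.
from dataclasses import dataclass, field


@dataclass
class Universe:
    name: str
    region: tuple  # axis-aligned box: (xmin, ymin, zmin, xmax, ymax, zmax)
    simulated_models: list = field(default_factory=list)    # full haptic pipeline
    background_avatars: list = field(default_factory=list)  # visual-only stand-ins

    def contains(self, hip):
        """True if the HIP position lies inside this universe's region."""
        x, y, z = hip
        x0, y0, z0, x1, y1, z1 = self.region
        return x0 <= x <= x1 and y0 <= y <= y1 and z0 <= z <= z1


class Multiverse:
    def __init__(self, universes):
        self.universes = universes
        self.active = universes[0]

    def update(self, hip):
        """Activate the universe containing the HIP; the switch is meant
        to be transparent to the user, since background avatars keep the
        full scene visually intact in every universe."""
        for u in self.universes:
            if u.contains(hip):
                if u is not self.active:
                    # Here a real system would enable u's collision
                    # detection/response algorithms and disable the rest.
                    self.active = u
                break
        return self.active


# Usage: a rigid-body universe next to a deformable-body universe.
mv = Multiverse([
    Universe("rigid", (0, 0, 0, 1, 1, 1), ["ribcage"]),
    Universe("deformable", (1, 0, 0, 2, 1, 1), ["lung"], ["ribcage_avatar"]),
])
print(mv.update((0.5, 0.5, 0.5)).name)  # rigid
print(mv.update((1.5, 0.5, 0.5)).name)  # deformable
```

In this sketch each universe could pair a different collision scheme with its contents (e.g. a rigid-body algorithm in one, a deformable-body algorithm in another), which is the heterogeneity the abstract targets; the avatars keep the rest of the scene visible without paying its simulation cost.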
