Immersive WYSIWYG (What You See is What You Get) Volume Visualization

Extended to immersive environments, volume visualization offers analytical advantages in spatial immersion, user engagement, multidimensional awareness, and other aspects. However, traditional single-channel precise interaction methods cannot be applied directly in a highly immersive virtual environment. Inspired by how users typically interact with everyday objects, this paper proposes a novel non-contact gesture interaction method for volume rendering results based on the What You See Is What You Get (WYSIWYG) principle. Just like grabbing objects in a real scene, a full set of tools is developed that enables direct manipulation of the color, saturation, contrast, brightness, and other optical properties of the volume rendering through gestural motions. At the same time, to improve the interactive experience in the immersive environment, an evaluation model of motion comfort is introduced to guide the design of the interaction gestures, and a cursor model is defined to estimate the gesture state in combination with the context of gestural motions. Finally, a test platform built with an Oculus Rift and a Leap Motion controller verifies the functionality and effectiveness of the method in improving visual cognitive ability for volume visualization.
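To make the idea concrete, the two ingredients the abstract names, gestural manipulation of optical properties and a cursor model that estimates gesture state from recent motion, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the mapping of hand displacement to saturation/brightness, the pinch-distance threshold, and all function names here are assumptions chosen for clarity.

```python
import colorsys

def apply_gesture_delta(rgb, dx, dy):
    """Illustrative mapping (assumed, not from the paper): a horizontal
    hand displacement dx adjusts saturation, a vertical displacement dy
    adjusts brightness, both in HSV space. Displacements are normalized
    to [-1, 1]; the resulting channels are clamped to [0, 1]."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    s = min(1.0, max(0.0, s + 0.5 * dx))
    v = min(1.0, max(0.0, v + 0.5 * dy))
    return colorsys.hsv_to_rgb(h, s, v)

def cursor_state(pinch_distances_mm, threshold=30.0):
    """Toy cursor-state estimator: classify a pinch ("grab") from the
    thumb-index fingertip distance, averaged over the last few frames
    so that single-frame tracking jitter does not flip the state."""
    avg = sum(pinch_distances_mm) / len(pinch_distances_mm)
    return "grab" if avg < threshold else "idle"
```

In an actual immersive pipeline, `pinch_distances_mm` would come from per-frame hand-tracking data (e.g. Leap Motion fingertip positions), and the color adjustment would be applied to the transfer function of the direct volume renderer rather than to a single RGB value.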
