Proportional Likelihood Estimation for Integrating Vibrotactile and Force Cues in 3D User Interaction

A model of integration for vibrotactile and force cues is important for facilitating human users' task performance in human-machine systems. One such human-machine system is an interactive three-dimensional (3D) virtual environment (VE). In this paper, we propose proportional likelihood estimation (PLE) as a model for integrating vibrotactile and force cues. Assuming that human responses to the cues follow Gaussian distributions, PLE integrates the cues proportionally according to their weighted contributions. We conducted an experiment to verify the suitability of PLE. For the experiment, we created a VE in which a human user interactively executed an identification task. The task required the user to identify visually indiscernible defects on a transmission line using a flying drone. The defects were indicated to the user through vibrotactile and/or force cues, delivered in a co-located or dislocated setting on the user's right hand and/or forearm, respectively. The PLE predictions for integrating the vibrotactile and force cues matched the empirical observations of the combined cues. PLE also successfully elucidated the cue integration when applied to an existing dataset acquired under a different experimental condition. Further analyses revealed that the cue integration may not be entirely additive. Hence, PLE could shed light on cue integration for facilitating user interaction in human-machine systems such as VEs.
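The abstract does not spell out PLE's exact weighting rule, so the following is only a minimal sketch of weighted Gaussian cue integration for illustration. It assumes the weights are proportional to each cue's reliability (inverse variance), as in standard maximum-likelihood cue integration; the function name and parameters are hypothetical, not taken from the paper.

```python
import numpy as np

def integrate_cues(mu_v, sigma_v, mu_f, sigma_f):
    """Combine a vibrotactile estimate (mu_v, sigma_v) and a force estimate
    (mu_f, sigma_f) into one weighted estimate, assuming Gaussian responses
    and reliability-proportional weights (an assumption, not PLE's stated rule)."""
    r_v, r_f = 1.0 / sigma_v**2, 1.0 / sigma_f**2      # cue reliabilities
    w_v, w_f = r_v / (r_v + r_f), r_f / (r_v + r_f)    # proportional weights, sum to 1
    mu_c = w_v * mu_v + w_f * mu_f                     # combined estimate
    sigma_c = np.sqrt(1.0 / (r_v + r_f))               # predicted combined spread
    return mu_c, sigma_c

# Example: the more reliable force cue receives the larger weight.
print(integrate_cues(mu_v=0.6, sigma_v=0.2, mu_f=0.5, sigma_f=0.1))
```

Under this fully additive scheme the combined variance can never exceed that of the better single cue; the paper's finding that integration may not be entirely additive would correspond to deviations from such predictions.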
