Augmented Reality Views for Occluded Interaction

We rely on our sight when manipulating objects; when objects are occluded, manipulation becomes difficult. Augmented reality can show such occluded objects and thereby re-enable visual guidance, but it is unclear how best to present them to support object manipulation. We compare four views of occluded objects and their effect on performance and satisfaction across a set of everyday manipulation tasks of varying complexity. The two best-performing views were a see-through view and a displaced 3D view: the former let participants observe the manipulated object through the occluder, while the latter showed a 3D view of the manipulated object offset from the object's real location. The worst-performing view showed remote imagery from a simulated hand-mounted camera. Our results suggest that aligning virtual objects with their real-world location is less important than an appropriate point of view and view stability.
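
As a concrete illustration of the displaced 3D view, the minimal sketch below applies a fixed world-space offset to the tracked pose of the occluded object before rendering, so a virtual copy appears beside the occluder rather than aligned with the hidden real object. The pose representation (a 4x4 matrix) and the specific offset are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def displaced_view_pose(object_pose: np.ndarray,
                        offset: np.ndarray) -> np.ndarray:
    """Offset a tracked 4x4 object pose by a world-space translation.

    A displaced 3D view renders the occluded object at its tracked
    pose shifted by `offset`, instead of at its true (hidden)
    location behind the occluder.
    """
    displaced = object_pose.copy()
    displaced[:3, 3] += offset  # translate in world coordinates
    return displaced

# Example: shift the virtual copy 30 cm along the world x-axis so it
# stays visible while the hand works behind the occluder.
object_pose = np.eye(4)             # tracked pose (placeholder value)
offset = np.array([0.3, 0.0, 0.0])  # metres
virtual_pose = displaced_view_pose(object_pose, offset)
```

A see-through view would instead keep the object at `object_pose` and render the occluder semi-transparently; the trade-off the study probes is whether that real-world alignment outweighs the displaced view's unobstructed vantage point.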
