Extending Upper Limb User Interactions in AR, VR and MR Headsets Employing a Custom-Made Wearable Device

Upper limb interactions play an important role in virtual, augmented, and mixed reality (VR, AR, and MR) scenarios. Numerous sensors, including optical, magnetic, mechanical, and myographic devices, have been employed to provide more natural interactions than game controllers. Recently, VR, AR, and MR headsets have begun embedding hand-tracking sensors to simplify hardware requirements, easing operation and setup. This integration is referred to as inside-out tracking: hand and eye interactions are captured without external sensors or controllers, giving users greater interactive freedom. However, the motion capture area of inside-out sensors is limited by the field of view and technical characteristics of the cameras, and only a small set of hand gestures is supported. In this paper, we introduce a custom upper limb motion tracking device that extends the user's interaction range while wearing a VR, AR, or MR headset. Our 3D motion tracking system is a compact, wireless, wearable prototype that uses inertial measurement units (IMUs) to provide the orientation and position data employed for upper limb interaction outside the field of view of the inside-out sensors.
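To illustrate how IMU readings can yield the drift-corrected orientation data the prototype relies on, the sketch below shows a minimal single-axis complementary filter. This is an illustrative assumption, not the authors' implementation: it fuses a gyroscope rate (prone to drift when integrated) with an accelerometer's gravity-derived angle (noisy but drift-free) into one pitch estimate.

```python
import math

def complementary_filter(pitch_prev, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse one gyroscope and one accelerometer sample into a pitch estimate.

    Illustrative sketch only (not the paper's actual filter):
      - pitch_prev: previous pitch estimate in degrees
      - gyro_rate:  angular rate about the pitch axis in deg/s
      - accel_x/z:  accelerometer components used to infer pitch from gravity
      - alpha:      weight on the integrated gyro term (high-pass),
                    with (1 - alpha) weighting the accelerometer (low-pass)
    """
    # Short-term: integrate the gyro rate over the sample interval.
    pitch_gyro = pitch_prev + gyro_rate * dt
    # Long-term: recover absolute pitch from the gravity vector.
    pitch_accel = math.degrees(math.atan2(accel_x, accel_z))
    # Blend: trust the gyro for fast motion, the accelerometer for drift correction.
    return alpha * pitch_gyro + (1 - alpha) * pitch_accel
```

In a full upper limb tracker, one such filter (or a quaternion-based equivalent) would run per joint axis, with limb segment lengths used to derive position from the chained orientations.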
