Extension of Projection Area using Head Orientation in Projected Virtual Hand Interface for Wheelchair Users

A projected virtual hand interface, which visually extends a user's arm with projected imagery, enables wheelchair users to reach otherwise unreachable objects; however, its projection area is narrower than the reaching area the users require. To address this problem, we propose a wheelchair system enhanced with a projected virtual hand that allows the projection area to be controlled by the user's head orientation. The proposed system estimates the current orientation of the user's head and controls the pan and tilt of a projector accordingly, moving the projection area while accounting for its positional relationship with the projection plane. Because users can steer the projection area simply by turning their head, this operation can be performed simultaneously with hand-based operation of the virtual hand. We propose a control model that rotates the projector according to the user's head direction. A user study revealed that the proposed method enables users to complete pointing tasks in a shorter time than the existing method while maintaining acceptable usability.
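
The paper's own control model is not given in this abstract. As a rough illustration only, the following Python sketch shows one plausible geometric mapping from head orientation to projector pan and tilt, assuming the head pose comes from an external tracker, the projection plane is a vertical wall at a known depth, and the projector sits on a two-axis pan-tilt mount. Every name, parameter, and the coordinate convention below is a hypothetical choice, not the authors' implementation.

import numpy as np

def head_to_projector_angles(head_pos, head_yaw, head_pitch,
                             projector_pos, wall_z):
    """Map head orientation to projector pan/tilt angles (radians).

    Assumed frame: right-handed, x right, y up, z forward toward
    the projection plane at depth z = wall_z.
    """
    # Unit direction of the head ray, from yaw (about y) and pitch (about x).
    direction = np.array([
        np.sin(head_yaw) * np.cos(head_pitch),
        np.sin(head_pitch),
        np.cos(head_yaw) * np.cos(head_pitch),
    ])
    if direction[2] <= 1e-6:
        return None  # Head is turned away from the wall; hold the last pose.

    # Intersect the head ray with the plane z = wall_z.
    t = (wall_z - head_pos[2]) / direction[2]
    target = head_pos + t * direction

    # Pan/tilt that aim the projector's optical axis at the target point,
    # compensating for the projector being mounted away from the head.
    offset = target - projector_pos
    pan = np.arctan2(offset[0], offset[2])
    tilt = np.arctan2(offset[1], np.hypot(offset[0], offset[2]))
    return pan, tilt

# Example: seated user looking 20 degrees right and slightly down,
# projector mounted 30 cm above and behind the head, wall 2 m ahead.
angles = head_to_projector_angles(
    head_pos=np.array([0.0, 1.2, 0.0]),
    head_yaw=np.radians(20.0), head_pitch=np.radians(-5.0),
    projector_pos=np.array([0.0, 1.5, -0.3]),
    wall_z=2.0)

In practice such a mapping would run once per tracking frame, with low-pass filtering on the head angles so that small head movements do not make the projection area jitter.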
