Nearmi: A Framework for Designing Point of Interest Techniques for VR Users with Limited Mobility

We propose Nearmi, a framework that enables designers to create customizable and accessible point-of-interest (POI) techniques in virtual reality (VR) for people with limited mobility. Designers can use Nearmi by creating and combining instances of its four components: representation, display, selection, and transition. These components enable users to gain awareness of POIs in virtual environments and automatically re-orient the virtual camera toward a selected POI. We conducted a video elicitation study in which 17 participants with limited mobility provided feedback on different Nearmi implementations. Although participants generally weighed the same design considerations when discussing their preferences, their choices reflected tradeoffs in accessibility, realism, spatial awareness, comfort, and familiarity with the interaction. Our findings highlight the need for accessible and customizable VR interaction techniques, as well as design considerations for building and evaluating such techniques.
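To make the component structure concrete, the sketch below shows one way the four Nearmi components could be expressed as composable interfaces. This is a minimal illustration, not the authors' implementation; all class and method names (POI, Representation, Display, Selection, Transition, NearmiTechnique, etc.) are hypothetical.

```python
# Hypothetical sketch of Nearmi's four components as composable interfaces.
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class POI:
    """A point of interest: a label plus a world-space position."""
    label: str
    position: Tuple[float, float, float]  # (x, y, z) in the virtual environment


class Representation(ABC):
    """How an individual POI is depicted (e.g., icon, thumbnail, text label)."""
    @abstractmethod
    def render(self, poi: POI) -> None: ...


class Display(ABC):
    """Where and how the set of representations is laid out for awareness."""
    @abstractmethod
    def layout(self, pois: List[POI]) -> None: ...


class Selection(ABC):
    """How the user chooses a POI (e.g., gaze dwell, controller click, voice)."""
    @abstractmethod
    def pick(self, pois: List[POI]) -> POI: ...


class Transition(ABC):
    """How the virtual camera re-orients toward the chosen POI (e.g., snap, animated turn)."""
    @abstractmethod
    def reorient_camera(self, target: POI) -> None: ...


@dataclass
class NearmiTechnique:
    """A POI technique is one instance of each component, combined."""
    representation: Representation
    display: Display
    selection: Selection
    transition: Transition

    def run(self, pois: List[POI]) -> None:
        self.display.layout(pois)                 # surface POIs so the user gains awareness
        for poi in pois:
            self.representation.render(poi)       # depict each POI in the chosen form
        chosen = self.selection.pick(pois)        # accessible selection of a target POI
        self.transition.reorient_camera(chosen)   # automatic re-orientation toward it
```

Under this reading, a designer customizes a technique by swapping component instances (for example, a different Selection for users who cannot use a controller) while the surrounding pipeline stays the same.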
