What Are Others Looking at? Exploring 360° Videos on HMDs with Visual Cues about Other Viewers

Viewing 360° videos on a head-mounted display (HMD) can be an immersive experience. However, viewers often need guidance, as the freedom to rotate the view may cause them to miss important content. We explore a novel, automatic approach to this problem with dynamic guidance methods called social indicators. These use viewers' gaze data to identify popular areas in 360° videos, which are then visualized for subsequent viewers. We developed and evaluated two different social indicators in a 30-participant user study. Although the indicators show great potential for subtly guiding users and improving the experience, finding the balance between guidance and self-exploration is vital. In addition, users showed varying interest in indicators representing a larger audience but reported a clear desire to use the indicators with their friends. We also present guidelines for providing dynamic guidance in 360° videos.
