Multi-modal approach to evaluate adaptive visual stimuli of remote operation system using gaze behavior

Abstract A multi-modal approach is proposed to evaluate the usability of Adaptive Visual Stimuli for User Interface (AVS4UI) in remote operation systems. This study focuses on evaluating AVS4UI for forklift work because the operation combines driving and cargo handling, which typically demands attention to multiple salient cues. Presenting this amount of information simultaneously on a User Interface (UI) tends to confuse operators and reduce operation efficiency. AVS4UI is therefore a promising solution in which the optimal visual stimuli are presented autonomously for different work conditions. However, evaluating AVS4UI is challenging because operators may be disoriented by adaptive information and work without regard for safety. Therefore, novel gaze metrics are proposed to evaluate the responses of forklift operators to AVS4UI so that undesired behavior can be detected. The proposed metrics implicitly represent gaze patterns in terms of transition and distribution between UI elements, operation safety, and familiarity with the adaptive system. An ideal AVS4UI is expected to minimize the proposed gaze metrics and outperform a non-adaptive UI. More importantly, the results of these metrics are consistent with perceived workload as measured by the NASA Task Load Index. We also propose a correlation model, built with stepwise linear regression, that provides a reasonable estimate of perceived workload. These novel metrics and the correlation model enable objective, online evaluation that minimizes the biases of subjective responses, so that an online work support system can be developed to assist workers.
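The correlation model mentioned above rests on stepwise linear regression, in which predictors are added greedily while they keep improving the fit. The sketch below shows the forward-selection idea only: the gaze-metric names, the data, and the stopping rule are illustrative assumptions, not the paper's actual model or metrics.

```python
# Minimal forward stepwise linear regression sketch: greedily add the
# candidate feature (e.g. a gaze metric) that most reduces the residual
# sum of squares when predicting a workload score. Purely illustrative.

def ols_fit(X, y):
    """Ordinary least squares via the normal equations (Gaussian elimination)."""
    n, p = len(X), len(X[0])
    A = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(p)]
         for i in range(p)]
    b = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]
    for i in range(p):
        piv = max(range(i, p), key=lambda r: abs(A[r][i]))  # partial pivoting
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, p):
            f = A[r][i] / A[i][i]
            for c in range(i, p):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    beta = [0.0] * p
    for i in range(p - 1, -1, -1):
        beta[i] = (b[i] - sum(A[i][c] * beta[c] for c in range(i + 1, p))) / A[i][i]
    return beta

def rss(X, y, beta):
    """Residual sum of squares of a fitted linear model."""
    return sum((y[k] - sum(beta[j] * X[k][j] for j in range(len(beta)))) ** 2
               for k in range(len(y)))

def forward_stepwise(features, y, tol=1e-6):
    """Add one feature per round while the RSS improves by more than tol."""
    selected = []
    X = [[1.0] for _ in y]                      # start with intercept only
    best_rss = rss(X, y, ols_fit(X, y))
    improved = True
    while improved and len(selected) < len(features):
        improved = False
        for name, col in features.items():
            if name in selected:
                continue
            Xc = [row + [col[k]] for k, row in enumerate(X)]
            cand = rss(Xc, y, ols_fit(Xc, y))
            if cand < best_rss - tol:           # keep best candidate so far
                best_rss, best_name, best_X = cand, name, Xc
                improved = True
        if improved:
            selected.append(best_name)
            X = best_X
    return selected, ols_fit(X, y)
```

For example, with a hypothetical gaze-transition metric that perfectly explains a workload score and an irrelevant second metric, the procedure selects only the informative one and then stops. In practice the paper's caution applies: stepwise selection is convenient for online estimation but its selection order should be validated, e.g. against held-out data.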
