Multi-modal approach to evaluate adaptive visual stimuli of remote operation system using gaze behavior
[1] Mitsuru Kawamoto,et al. Development of A Virtual Environment to Realize Human-Machine Interaction of Forklift Operation , 2019, 2019 7th International Conference on Robot Intelligence Technology and Applications (RiTA).
[2] Shahrokh Nikou,et al. Usability and UX of Learning Management Systems: An Eye-Tracking Approach , 2020, 2020 IEEE International Conference on Engineering, Technology and Innovation (ICE/ITMC).
[3] Pradipta Biswas,et al. Eye Gaze Controlled Robotic Arm for Persons with SSMI , 2020, arXiv.
[4] Melissa Patricia Coral. Analyzing Cognitive Workload through Eye-Related Measurements: A Meta-Analysis , 2016 .
[5] Ralph Schroeder,et al. Small-Group Behavior in a Virtual and Real Environment: A Comparative Study , 2000, Presence: Teleoperators & Virtual Environments.
[6] Shengyuan Yan,et al. Evaluation and prediction mental workload in user interface of maritime operations using eye response , 2019, International Journal of Industrial Ergonomics.
[7] Ulf Ahlstrom,et al. Using eye movement activity as a correlate of cognitive workload , 2006 .
[8] Hiromasa Suzuki,et al. Glance behavior as design indices of in-vehicle visual support system: A study using crane simulators. , 2018, Applied ergonomics.
[9] Azizah Jaafar,et al. Eye Tracking in Educational Games Environment: Evaluating User Interface Design through Eye Tracking Patterns , 2011, IVIC.
[10] André P. Calitz,et al. The evaluation of an adaptive user interface model , 2010, SAICSIT '10.
[11] Krzysztof Z. Gajos,et al. Predictability and accuracy in adaptive user interfaces , 2008, CHI.
[12] Syed Khuram Shahzad,et al. Usability evaluation of adaptive features in smartphones , 2017, KES.
[13] Mohammed Lataifeh,et al. Adaptive user interface design and analysis using emotion recognition through facial expressions and body posture from an RGB-D sensor , 2020, PloS one.
[14] David M. Allen,et al. The Relationship Between Variable Selection and Data Agumentation and a Method for Prediction , 1974 .
[15] Hung-Lin Chi,et al. Development of user interface for tele-operated cranes , 2012, Adv. Eng. Informatics.
[16] Johannes Fottner,et al. Evaluation of Remote Crane Operation with an Intuitive Tablet Interface and Boom Tip Control , 2020, 2020 IEEE International Conference on Systems, Man, and Cybernetics (SMC).
[17] Qian Yang,et al. A Long-Term Evaluation of Adaptive Interface Design for Mobile Transit Information , 2020, MobileHCI.
[18] Kiyoshi Kiyokawa,et al. Adaptive View Management for Drone Teleoperation in Complex 3D Structures , 2017, IUI.
[19] Eiichi Yoshida,et al. Modeling Viewpoint of Forklift Operators Using Context-Based Clustering of Gaze Fixations , 2021, HCI.
[20] Sabri Ahmad,et al. Stepwise Multiple Regression Method to Forecast Fish Landing , 2010 .
[21] Daisuke Kurabayashi,et al. Embodiment Sensing for Self-generated Zigzag Turning Algorithm Using Vision-Based Plume Diffusion , 2014, SIMPAR.
[22] Pradipta Biswas,et al. A case study of developing gaze controlled interface for users with severe speech and motor impairment , 2019 .
[23] Sara L. Riggs,et al. Effects of Workload and Workload Transitions on Attention Allocation in a Dual-Task Environment: Evidence From Eye Tracking Metrics , 2020 .
[24] Klaus Bengler,et al. The relationship between pilots' manual flying skills and their visual behavior: a flight simulator study using eye tracking , 2012 .
[25] Frank Schieber,et al. Visual Entropy Metric Reveals Differences in Drivers' Eye Gaze Complexity across Variations in Age and Subsidiary Task Load , 2008 .
[26] Shoichi Maeyama,et al. View Point Decision Algorithm for an Autonomous Robot to Provide Support Images in the Operability of a Teleoperated Robot , 2016 .
[27] Koichi Ohtomi,et al. Skill Metrics for Mobile Crane Operators Based on Gaze Fixation Pattern , 2017 .
[28] Jelle Saldien,et al. Mobile pupillometry in manual assembly: A pilot study exploring the wearability and external validity of a renowned mental workload lab measure , 2020, International Journal of Industrial Ergonomics.
[29] Gowdham Prabhakar,et al. Detecting drivers’ cognitive load from saccadic intrusion , 2018 .
[30] Joanna McGrenere,et al. Impact of screen size on performance, awareness, and user satisfaction with adaptive graphical user interfaces , 2008, CHI.
[31] Tian Zhou,et al. Eye-Tracking Metrics Predict Perceived Workload in Robotic Surgical Skills Training , 2019, Hum. Factors.
[32] Daisuke Kurabayashi,et al. Quantitative Analysis of the Silk Moth’s Chemical Plume Tracing Locomotion Using a Hierarchical Classification Method , 2014 .
[33] Wen-Chin Li,et al. The evaluation of pilots performance and mental workload by eye movement , 2012 .
[34] E. Buffalo,et al. A nonparametric method for detecting fixations and saccades using cluster analysis: Removing the need for arbitrary thresholds , 2014, Journal of Neuroscience Methods.
[35] Krzysztof Z. Gajos,et al. Design Space and Evaluation Challenges of Adaptive Graphical User Interfaces , 2009, AI Mag..
[36] Ping-Rui Tsai,et al. Categorizing SHR and WKY rats by chi2 algorithm and decision tree , 2020, Scientific Reports.
[37] Kosuke Sekiyama,et al. Visual support system for remote control by adaptive ROI selection of monitoring robot , 2018 .
[38] Hiromasa Suzuki,et al. Gaze Behavior and Emotion of Crane Operators for Different Visual Support System , 2017, HCI.
[39] A. Kramer,et al. Physiological metrics of mental workload: A review of recent progress , 1990, Multiple-task performance.
[40] Elspeth M McDougall,et al. Validation of surgical simulators. , 2007, Journal of endourology.
[41] Naoki Mizukami,et al. Japanese version of NASA Task Load Index , 1996 .
[42] Gary Smith,et al. Step away from stepwise , 2018, Journal of Big Data.