Relating Human Gaze and Manual Control Behavior in Preview Tracking Tasks with Spatial Occlusion

In manual tracking tasks with preview of the target trajectory, humans have been modeled as dual-mode "near" and "far" viewpoint controllers. This paper investigates the physical basis of these two control mechanisms, and studies whether the estimated viewpoint positions correspond to the parts of the previewed trajectory that humans actually use for control. Combined human gaze and control data are obtained in an experiment comparing tracking with full preview (1.5 s), occluded preview, and no preview. System identification is applied to estimate the two look-ahead time parameters of a two-viewpoint preview model. Results show that humans often direct their gaze around the model's near-viewpoint position, and seldom at the far viewpoint. Gaze measurements may thus augment control data for the online identification of preview control behavior, to improve personalized monitoring or shared-control systems in vehicles.
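The two-viewpoint structure described above can be illustrated with a minimal sketch: the control input combines error feedback on a near previewed point with an anticipatory term derived from a far previewed point. The function name, gains, and look-ahead times below are hypothetical placeholder values for illustration, not the fitted parameters from the experiment.

```python
# Hypothetical sketch of a two-viewpoint preview control law.
# The previewed target trajectory is sampled at a near look-ahead
# time (fine, closed-loop error correction) and a far look-ahead
# time (anticipatory, feedforward-like response). All parameter
# values are illustrative assumptions, not identified estimates.

def two_viewpoint_control(target, t, x,
                          tau_near=0.2, tau_far=0.9,
                          k_near=2.0, k_far=1.0):
    """Return a control input from near/far previewed target points.

    target : callable giving the previewed target trajectory f(t)
    t      : current time [s]
    x      : current system output (e.g., cursor position)
    """
    f_near = target(t + tau_near)  # near viewpoint sample
    f_far = target(t + tau_far)    # far viewpoint sample
    # Near point drives error feedback; the near-to-far slope of the
    # previewed trajectory provides the anticipatory component.
    return k_near * (f_near - x) + k_far * (f_far - f_near) / (tau_far - tau_near)
```

For a constant target the anticipatory term vanishes and only the near-point error feedback remains; for a ramp target both terms contribute, mimicking the model's dual-mode response.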
