Videopanorama Frame Rate Requirements Derived from Visual Discrimination of Deceleration During Simulated Aircraft Landing

In order to determine the visual frame rate (FR) required to minimize prediction errors with out-the-window video displays at remote/virtual airport towers, 13 active air-traffic controllers viewed high-dynamic-fidelity simulations of landing aircraft and judged whether each aircraft would decelerate in time to make a given runway turnoff or whether a runway excursion was to be expected. The viewing conditions and simulation dynamics replicated the visual rates and environments of transport aircraft landing at small commercial airports. The required frame rate was estimated using Bayes inference on prediction errors, by linear FR-extrapolation of event probabilities conditional on the predictions (stop, no-stop). Further estimates were obtained from exponential model fits to the parametric and nonparametric perceptual discriminabilities d′ and A (average area under the ROC curve) as functions of FR. Decision errors were biased towards predicting overshoot, apparently due to an illusory increase in perceived speed at low frame rates. Both the Bayes and A extrapolations yield a frame rate requirement of 35–40 Hz for minimizing decision errors. Definitive recommendations require further experiments with FR > 30 Hz.
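The sensitivity measures named above can be sketched as follows. A minimal illustration, assuming standard signal-detection formulas: the parametric discriminability d′ = z(H) − z(F) from hit rate H and false-alarm rate F, and one common single-point nonparametric estimate of the area under the ROC curve (often written A′; the paper's A is an average over ROC curves, which this sketch does not reproduce). The frame rates and hit/false-alarm rates below are hypothetical values for illustration, not the experimental data.

```python
from statistics import NormalDist

def d_prime(hit: float, fa: float) -> float:
    """Parametric sensitivity d' = z(H) - z(F), with z the inverse normal CDF."""
    z = NormalDist().inv_cdf
    return z(hit) - z(fa)

def a_prime(hit: float, fa: float) -> float:
    """Common single-point nonparametric ROC-area estimate (valid for H >= F)."""
    return 0.5 + (hit - fa) * (1.0 + hit - fa) / (4.0 * hit * (1.0 - fa))

# Hypothetical (frame rate [Hz], hit rate, false-alarm rate) triples for the
# stop/no-stop judgment -- illustrative only, not the study's data.
for fr, h, f in [(12, 0.70, 0.40), (24, 0.80, 0.30), (30, 0.83, 0.28)]:
    print(f"FR={fr:2d} Hz  d'={d_prime(h, f):.2f}  A'={a_prime(h, f):.3f}")
```

Fitting a saturating model (e.g. exponential in FR) to such sensitivity values and extrapolating to the rate at which sensitivity plateaus is the kind of analysis the abstract describes; the resulting requirement reported there is 35–40 Hz.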
