Development of the Motion Perception Toolbox

This paper gives an overview and practical examples of the Motion Perception Toolbox (MPT), developed by TNO (the Netherlands Organisation for Applied Scientific Research) as a freeware MATLAB Simulink library. The MPT provides a documented set of building blocks that model human motion perception and are easily coupled to existing simulation models. For example, coupled to an existing aircraft model, the MPT can be used to predict pilot motion perception or the occurrence of motion illusions. Although the brain certainly does not implement a set of transfer functions or differential equations, some global characteristics of human behaviour can be modelled as input-output relations by mathematical transfer functions identified in a long tradition of human perception research. We have tried to bring together the most fundamental of these relations, based on consensus in the literature and our own insights. Examples of building blocks in the MPT library, explained and applied in this paper, are: 1) calculation of inertial head motion from vehicle and pilot motion, to be used as sensory input; 2) transfer functions of the visual and vestibular systems, as well as their interaction in velocity perception; and 3) a 3D animation tool to visualize perception output intuitively. The Motion Perception Toolbox offers engineers a starting point from which to analyze simulation results (e.g. of aircraft or cars) from a human perception and control point of view. The vast amount of (sometimes ambiguous) experimental data, the variety of physiological models, and differing expert opinions make it impossible to develop a single, universal Motion Perception Toolbox. With this in mind, we invite the AIAA community to participate actively in its development. To that end, the MPT is open source and can be downloaded from www.desdemona.eu. The simulation examples in this paper are included as demos in the MPT and can also be downloaded.
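As a flavour of how such transfer-function blocks can be exercised outside a Simulink diagram, the sketch below implements a commonly cited second-order semicircular-canal model in plain MATLAB (it requires the Control System Toolbox). The time constants and the step input are illustrative assumptions for this example and do not necessarily match the parameter values or block structure shipped with the MPT.

% Illustrative sketch, not the MPT's own blocks: a commonly used
% second-order model of the semicircular canals, relating perceived
% to actual angular velocity, simulated for a sustained yaw rotation.
tau1 = 5.9;                      % long (cupula) time constant [s], assumed value
tau2 = 0.005;                    % short time constant [s], assumed value

s    = tf('s');                  % Laplace variable (Control System Toolbox)
Hscc = (tau1*s) / ((tau1*s + 1)*(tau2*s + 1));   % canal dynamics

t     = 0:0.01:30;               % 30 s simulation at 100 Hz
omega = 60*ones(size(t));        % step to a constant 60 deg/s yaw rate
omega_perceived = lsim(Hscc, omega, t);

plot(t, omega, '--', t, omega_perceived);
xlabel('time [s]'); ylabel('angular velocity [deg/s]');
legend('actual', 'perceived');

Because the model behaves as a high-pass filter, the simulated perceived angular velocity decays towards zero during the sustained rotation; it is this washout of the vestibular signal that underlies several of the motion illusions the MPT is intended to predict.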
