A Neural Model of the Fly Visual System Applied to Navigational Tasks

We investigate how elementary motion detectors (EMDs) can be used to control behavior. We have developed a model of the fly visual system that operates in real time under real-world conditions; it was tested on course- and altitude-stabilization tasks using a flying robot. While the robot could stabilize its gaze (i.e., orientation), we found that stabilizing translational movements requires more elaborate preprocessing of the visual input and fine-tuning of the EMDs. Our results show that controlling gaze and altitude requires EMD information to be computed in separate processing streams.
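EMDs are commonly modeled as correlation-type (Reichardt) detectors: the signal from one photoreceptor is delayed by a low-pass filter and multiplied with the undelayed signal of its neighbor, and the mirror-symmetric product is subtracted. The sketch below illustrates this standard scheme; the time constant, stimulus parameters, and function name are illustrative assumptions, not details taken from the paper's model.

```python
import numpy as np

def reichardt_emd(left, right, tau=0.05, dt=0.001):
    """Correlation-type elementary motion detector (Reichardt model).

    left, right: luminance signals from two neighboring photoreceptors,
    sampled at interval dt (seconds).
    tau: time constant of the first-order low-pass ("delay") filter.
    Returns the opponent EMD output over time; its mean is positive for
    motion from the `left` receptor toward the `right` receptor.
    """
    alpha = dt / (tau + dt)              # smoothing factor of the low-pass
    lp_left = np.zeros_like(left)
    lp_right = np.zeros_like(right)
    for t in range(1, len(left)):        # first-order low-pass as delay line
        lp_left[t] = lp_left[t - 1] + alpha * (left[t] - lp_left[t - 1])
        lp_right[t] = lp_right[t - 1] + alpha * (right[t] - lp_right[t - 1])
    # Opponent correlation: delayed-left x direct-right minus mirror image
    return lp_left * right - left * lp_right

# Example: a 4 Hz grating moving so the right receptor sees it phase-delayed
t = np.arange(0, 1, 0.001)
phase_lag = 0.3                          # spatial phase offset (radians)
left = np.sin(2 * np.pi * 4 * t)
right = np.sin(2 * np.pi * 4 * t - phase_lag)
out = reichardt_emd(left, right)
print(np.mean(out[200:]))                # positive: rightward motion detected
```

Swapping the two input signals reverses the perceived motion direction, so the mean output flips sign; this opponent structure is what makes the detector directionally selective.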