Flight-control and navigation systems inspired by the structure and function of the visual systems and brains of insects have been proposed for a class of developmental miniature robotic aircraft called “biomorphic flyers,” described earlier in “Development of Biomorphic Flyers” (NPO-30554), NASA Tech Briefs, Vol. 28, No. 11 (November 2004), page 54. These flyers form a subset of biomorphic explorers, which, as reported in several articles in past issues of NASA Tech Briefs [“Biomorphic Explorers” (NPO-20142), Vol. 22, No. 9 (September 1998), page 71; “Bio-Inspired Engineering of Exploration Systems” (NPO-21142), Vol. 27, No. 5 (May 2003), page 54; and “Cooperative Lander-Surface/Aerial Microflyer Missions for Mars Exploration” (NPO-30286), Vol. 28, No. 5 (May 2004), page 36], are proposed small robots, equipped with microsensors and communication systems, that would incorporate crucial functions of mobility, adaptability, and even cooperative behavior. These functions are inherent to biological organisms but remain challenging frontiers for technical systems. Biomorphic flyers could be used on Earth or on remote planets to explore sites that are otherwise difficult or impossible to reach. One example of an exploratory search/surveillance task currently being tested is the acquisition of high-resolution aerial imagery using a variety of miniaturized electronic cameras. The control functions to be implemented by the systems under development include holding altitude, avoiding hazards, following terrain, navigating by reference to recognizable terrain features, stabilizing flight, and landing smoothly. Flying insects perform these and other functions remarkably well, even though insect brains contain fewer than 10⁻⁵ as many neurons as does the human brain. Although most insects have immobile, fixed-focus eyes and lack stereoscopy (and hence cannot perceive depth directly), they utilize a number of ingenious strategies for perceiving, and navigating in, three dimensions.
Despite their lack of stereoscopy, insects infer distances to potential obstacles and other objects from image-motion cues that result from their own motion through the environment. The motion of image texture as a source of such cues is known generally as optic (or optical) flow. Computationally, a strategy based on optical flow is simpler than stereoscopy for avoiding hazards and following terrain. Hence, this strategy offers the potential to design vision-based control computing subsystems that would be more compact, weigh less, and demand less power than subsystems of equivalent capability based on a conventional stereoscopic approach.

Control loops for stabilizing attitude and/or holding altitude would include optoelectronic ocelli and would be based partly on dragonfly ocelli: simple eyes that exist in addition to the better-known compound eyes of insects. In many insects, the ocelli merely detect changes in light intensity and have minimal observable effect on flight. In dragonflies, however, the ocelli play an important role in stabilizing attitude with respect to dorsal light levels. The control loops to be implemented would incorporate elements of both dragonfly ocellar function and optical-flow computation as derived from principles observed in honeybee flight.

On Earth, bees use sky polarization patterns in the ultraviolet part of the spectrum as a direction reference relative to the position of the Sun. A robotic direction-finding technique based on this concept is more robust than a simple Sun compass because the ultraviolet polarization pattern is distributed redundantly across the entire sky and hence can be extrapolated from a small region of clear sky even when clouds elsewhere hide the Sun. A bee tends to adjust its flight speed to maintain a constant optical flow (that is, a constant angular velocity of the image of its surroundings).

[Figure: top view of tapered tunnel; dimensions shown: 100 cm, 12 cm]
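The dragonfly-inspired dorsal-light principle described above can be illustrated with a minimal control sketch. The function, gain names, and four-detector layout below are illustrative assumptions, not details from the brief: it merely shows how intensity imbalances between upward-facing, ocellus-like photodetectors could be turned into roll and pitch corrections.

```python
# Hedged sketch (illustrative, not the brief's implementation): attitude
# correction from ocellus-like upward-facing photodetectors, using the
# dorsal-light principle described for dragonfly ocelli.

def ocellar_correction(i_left, i_right, i_front, i_rear,
                       k_roll=1.0, k_pitch=1.0):
    """Return (roll_cmd, pitch_cmd) correction commands.

    A roll toward one side makes that side's detector see less sky and
    the opposite detector see more, so the normalized intensity
    imbalance approximates the attitude error about each axis.
    """
    roll_cmd = k_roll * (i_left - i_right) / max(i_left + i_right, 1e-9)
    pitch_cmd = k_pitch * (i_front - i_rear) / max(i_front + i_rear, 1e-9)
    return roll_cmd, pitch_cmd
```

With equal intensities the commands are zero (level flight); a brighter left detector yields a positive roll command that would level the craft under the assumed sign convention.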
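The polarization-compass idea can likewise be sketched. Under the single-scattering (Rayleigh) sky model, the polarization e-vector observed at the zenith is perpendicular to the solar azimuth, so one e-vector measurement constrains the Sun's bearing to two candidates 180 degrees apart. The function name and interface below are assumptions for illustration, not from the brief.

```python
# Hedged sketch: constraining heading from the zenith skylight
# polarization angle. Assumes a single polarization measurement at the
# zenith and the Rayleigh single-scattering sky model, under which the
# e-vector there is perpendicular to the solar azimuth.

def solar_bearing_candidates(e_vector_deg):
    """Given the measured e-vector angle at the zenith (degrees, in the
    flyer's body frame), return the two candidate solar bearings,
    which differ by 180 degrees."""
    bearing = (e_vector_deg + 90.0) % 360.0
    return bearing, (bearing + 180.0) % 360.0
```

The 180-degree ambiguity would be resolved by other cues (e.g., a coarse brightness gradient), and measurements from several sky patches could be combined, which is what makes the pattern usable even when most of the sky is overcast.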
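The honeybee strategy of holding image motion constant, as demonstrated in tapered-tunnel experiments, can be sketched as a simple regulator. It assumes lateral optical-flow magnitudes (rad/s) have already been estimated from side-looking cameras; the gains, target flow, and sign convention are illustrative assumptions.

```python
# Hedged sketch: honeybee-style control from lateral optical flow.
# Balancing left and right flow steers the flyer away from the nearer
# wall; holding the mean flow at a set point makes it slow down where
# a passage narrows, as bees do in a tapered tunnel.

def flow_control(flow_left, flow_right, target_flow=2.0,
                 k_steer=0.5, k_speed=0.2):
    """Return (steer_cmd, speed_cmd) from left/right flow magnitudes.

    Sign convention (assumed): a positive steer_cmd turns away from
    the left side; a positive speed_cmd commands acceleration.
    """
    steer_cmd = k_steer * (flow_right - flow_left)    # turn toward weaker flow
    mean_flow = 0.5 * (flow_left + flow_right)
    speed_cmd = k_speed * (target_flow - mean_flow)   # slow down if flow is high
    return steer_cmd, speed_cmd
```

For example, with strong flow on the left (a nearby wall) and weak flow on the right, the controller commands a turn to the right while leaving speed unchanged if the mean flow already matches the set point.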