Navigation with a tiny brain: Getting home without knowing where you are
The use of visual information for navigation is a universal strategy for sighted animals, amongst whom social insects are particular experts. General interest in studies of insect navigation stems in part from their small brains; biomimetic engineers can take inspiration from elegant and parsimonious control solutions, while biologists look for a description of the minimal cognitive requirements for complex spatial behaviours. We take an interdisciplinary approach to studying visually guided navigation, combining behavioural experiments with modelling and robotics to understand how complex behaviour can emerge from a simple sensory system and brain interacting with innate behaviours, all tuned to the natural habitat. In so doing, we show that an agent can robustly navigate without ever knowing where it is, without specifying when or what it should learn, and without requiring it to recognise specific objects, places, routes or maps. This leads to an algorithm in which navigation is driven by familiarity detection rather than explicit recall, with sensory data specifying actions, not locations. Route navigation is thus recast as a search for familiar views, allowing an agent to encode routes through visually complex worlds in a single-layer neural network after a single training run. We suggest that this work is a specific example of a more general idea with implications for engineers seeking nature-inspired solutions: by considering how animals directly acquire and use task-specific information through specialised sensors, brains and behaviours, we can solve complex problems without complex processing.
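To make the familiarity-driven idea concrete, here is a minimal sketch in Python, assuming a simple "perfect memory" familiarity measure in which the current view is compared pixel-wise against views stored during a single training traversal. All names and parameters are illustrative; this is not the authors' implementation, which uses a trained single-layer network rather than raw view storage, but it shows the key step of sensory data specifying an action (a heading), not a location.

```python
# Hypothetical sketch of familiarity-based route navigation.
# Assumption: views are low-resolution panoramic images, and familiarity
# is the negative of the smallest pixel-wise difference to any training view.
import numpy as np

def view_familiarity(view, stored_views):
    """Return how familiar a view looks relative to the training set."""
    diffs = np.sum((stored_views - view) ** 2, axis=(1, 2))
    return -diffs.min()

def choose_heading(panorama, stored_views, n_headings=36):
    """Scan candidate headings by rotating the panoramic view and pick
    the rotation that looks most familiar. The agent never estimates
    where it is; the view directly specifies which way to move."""
    width = panorama.shape[1]
    best_heading, best_fam = 0.0, -np.inf
    for i in range(n_headings):
        shift = int(round(i * width / n_headings))
        rotated = np.roll(panorama, -shift, axis=1)
        fam = view_familiarity(rotated, stored_views)
        if fam > best_fam:
            best_heading, best_fam = 360.0 * i / n_headings, fam
    return best_heading

# Toy usage with random data standing in for views from one training run.
rng = np.random.default_rng(0)
stored = rng.random((100, 10, 90))   # 100 training views, 10 x 90 pixels
current = rng.random((10, 90))       # current panoramic view
print(choose_heading(current, stored))
```

In the published model the stored views would be replaced by a single-layer network trained once along the route, so that memory does not grow with route length; the scan-for-the-most-familiar-heading loop is the part that recasts route navigation as a search for familiar views.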