Data-driven Segmentation of Grey-level Images with Coupled Nonlinear Oscillators

We extend a model for performing data-driven segmentation of grey-level images, which is able to integrate information from multiple visual modalities. It is based on stacked two-dimensional sheets of nonlinear oscillators, which interact in ways reflecting the Gestalt principles of similarity and neighbourhood. Segmentation is expressed as temporal correlation of the oscillators' activities, which is high within and low between segments. The oscillators in one segment synchronize through their excitatory interactions as defined by the sensory input, while inhibitory interneurons desynchronize groups of oscillators belonging to distinct segments. As previously presented [1], the model had global interactions, and its function was demonstrated on one artificial example. Here we introduce localized connectivity and show how to code feature input. Experiments show that, using grey level and movement as modalities, the system is able to segment images with quite different characteristics. The resulting segmentation reflects information obtained from both modalities, i.e., it could not have been predicted from analysing any one modality alone.
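The core mechanism, synchronization within a segment driven by similarity- and neighbourhood-weighted excitatory coupling, can be illustrated with a minimal sketch. The sketch below is not the paper's oscillator model: it substitutes Kuramoto-style phase oscillators on a 1-D "image", omits the inhibitory interneurons, and uses assumed values for the coupling gain `K` and the similarity scale `sigma`.

```python
import numpy as np

# Hypothetical 1-D grey-level "image" with two homogeneous segments.
grey = np.array([0.10, 0.12, 0.11, 0.90, 0.88, 0.91])
n = len(grey)

# One phase oscillator per pixel, random initial phases.
rng = np.random.default_rng(0)
phase = rng.uniform(0.0, 2.0 * np.pi, n)

K = 2.0      # excitatory coupling gain (assumed value)
sigma = 0.1  # grey-level similarity scale (assumed value)
dt = 0.05

for _ in range(2000):
    dphase = np.zeros(n)
    for i in range(n):
        # Neighbourhood principle: couple only to adjacent pixels.
        for j in (i - 1, i + 1):
            if 0 <= j < n:
                # Similarity principle: coupling decays with
                # grey-level difference, so the edge between the
                # two segments carries essentially no coupling.
                w = K * np.exp(-((grey[i] - grey[j]) / sigma) ** 2)
                dphase[i] += w * np.sin(phase[j] - phase[i])
    phase += dt * dphase

# Phase coherence (1.0 = perfect synchrony) within each segment.
coherence_left = abs(np.mean(np.exp(1j * phase[:3])))
coherence_right = abs(np.mean(np.exp(1j * phase[3:])))
```

Oscillators within each segment become tightly phase-locked (both coherence values approach 1), while the two segments remain effectively decoupled across the grey-level edge; in the full model the inhibitory interneurons would additionally push the two groups apart in phase.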