Integration and Control of Reactive Visual Processes

This paper describes a new approach to the integration and control of continuously operating visual processes. Visual processes are expressed as transformations which map signals from virtual sensors into commands for devices. These transformations define reactive processes which tightly couple perception and action. Such transformations may be used to control robotic devices, including fixation of an active binocular head, as well as to select and control the processes which interpret visual data. This method takes inspiration from so-called "behavioural" approaches to mobility and manipulation. However, unlike most previous work, we define reactive transformations at the level of virtual sensors and device controllers. This permits a system to integrate a large number of perceptual processes and to dynamically compose sequences of such processes to perform visual tasks. Transitions between visual processes are mediated by signals from a supervisory controller as well as signals obtained from perception. This method offers the possibility of constructing vision systems with large numbers of visual abilities in a manner which is both scalable and learnable. After a review of related work in mobility and manipulation, we adapt the reactive process framework to computer vision. We define reactive visual processes which map information from virtual sensors to device commands. We discuss the selection and control of reactive visual processes to accomplish visual tasks. We then illustrate this approach with a system which detects and fixates on different classes of moving objects.
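To make the architecture concrete, the following Python sketch shows one possible reading of the abstract's scheme: each visual process is a transformation from virtual-sensor signals to device commands, and transitions between processes are mediated by perceptual events or supervisory signals. This is an illustrative sketch, not the paper's implementation; all names (ReactiveProcess, supervisor_step, fixate, search, the signal and command fields) are hypothetical.

    from dataclasses import dataclass
    from typing import Callable, Dict, Optional, Tuple

    Signal = Dict[str, float]   # named measurements from a virtual sensor
    Command = Dict[str, float]  # named set-points for a device controller

    @dataclass
    class ReactiveProcess:
        """A tight perception-action coupling: signals in, commands out."""
        name: str
        transform: Callable[[Signal], Command]
        transitions: Dict[str, str]  # perceptual event -> successor process

    def fixate(signal: Signal) -> Command:
        # Drive pan/tilt so the target's image position moves toward center.
        return {"pan_vel": -0.5 * signal["target_x"],
                "tilt_vel": -0.5 * signal["target_y"]}

    def search(signal: Signal) -> Command:
        # Scan slowly; a "motion" event from perception triggers fixation.
        return {"pan_vel": 0.1, "tilt_vel": 0.0}

    PROCESSES = {
        "search": ReactiveProcess("search", search, {"motion": "fixate"}),
        "fixate": ReactiveProcess("fixate", fixate, {"target_lost": "search"}),
    }

    def supervisor_step(current: str, signal: Signal,
                        event: Optional[str]) -> Tuple[str, Command]:
        """One control cycle: apply the active transformation, then let a
        perceptual event (or supervisory signal) select the next process."""
        process = PROCESSES[current]
        command = process.transform(signal)
        next_name = process.transitions.get(event, current) if event else current
        return next_name, command

    # Example cycle: while fixating, a target slightly right of center
    # produces a corrective pan command.
    state, cmd = supervisor_step("fixate", {"target_x": 0.2, "target_y": 0.0}, None)

Because each transformation is expressed over virtual sensors rather than raw devices, new processes can be added without changing the supervisor, which is what makes composing large numbers of visual abilities tractable in this style of design.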
