A principled theory of sensing and action is crucial to developing task-level programming for autonomous mobile robots. A framework for such a theory is proposed, providing both a precise vocabulary and appropriate computational machinery for reasoning about information flow in and through a robot system equipped with various types of sensors and operating in a dynamic, unstructured environment. The authors focus on the problem of constructing virtual sensors out of concrete sensors: virtual sensors are defined in terms of existing concrete sensors, and a method of task-directed construction of such virtual sensors is described. Virtual sensors are queried in robot programs much as their concrete counterparts are. By allowing the task to direct the composition of virtual sensors, one can derive robot programs organized so as to guide the robot toward acquiring the information it needs to accomplish the task. Many information-acquisition and representational issues are made explicit.
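The idea of composing virtual sensors from concrete sensors, with both queried through the same interface, can be sketched as follows. This is a minimal illustration, not the authors' formalism: the class names, the `read()` interface, and the task-supplied fusion function are all assumptions made for the example.

```python
class ConcreteSensor:
    """Hypothetical wrapper: exposes a raw reading source behind read()."""
    def __init__(self, source):
        self._source = source

    def read(self):
        return self._source()


class VirtualSensor:
    """Hypothetical virtual sensor: combines concrete sensors via a
    task-supplied fusion function, and is queried exactly like a
    concrete sensor (same read() interface)."""
    def __init__(self, sensors, fuse):
        self._sensors = sensors
        self._fuse = fuse

    def read(self):
        # Query the underlying concrete sensors, then fuse their readings.
        return self._fuse([s.read() for s in self._sensors])


# Example: a "distance to nearest obstacle" virtual sensor built from two
# rangefinders; the task directs the composition by choosing min() as the
# fusion function. The readings are stubbed constants for illustration.
left = ConcreteSensor(lambda: 1.8)
right = ConcreteSensor(lambda: 2.3)
nearest_obstacle = VirtualSensor([left, right], min)
print(nearest_obstacle.read())  # -> 1.8
```

Because `VirtualSensor` satisfies the same query interface as `ConcreteSensor`, virtual sensors can themselves be composed into further virtual sensors, mirroring the task-directed construction the abstract describes.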