Sensor data fusion through a distributed blackboard

Sensor data fusion takes many forms and serves diverse purposes: it can sharpen a complex robot's picture of the world and increase the robot's confidence in the truth of that picture. Supporting the activities of such robots requires a computing architecture that accommodates many different fusion schemes. One such architecture is the distributed blackboard mechanism implemented onboard the USMC Ground Surveillance Robot (GSR) [1]. The distributed blackboard has proven to be a flexible and effective mechanism for accomplishing sensor data fusion.
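The blackboard idea can be illustrated with a minimal sketch: sensor modules post observations to a shared store, and a fusion module combines whatever is currently on the board. This is only an illustration of the general pattern; the class names, topics, and the confidence-weighted averaging rule here are assumptions for the example, not details of the GSR implementation.

```python
# Minimal blackboard sketch (illustrative; not the GSR design).
# Sensor modules post (source, value, confidence) tuples under a topic;
# a fusion routine combines all entries posted for that topic.

class Blackboard:
    """Shared store where independent sensor modules post observations."""

    def __init__(self):
        self.entries = {}  # topic -> list of (source, value, confidence)

    def post(self, topic, source, value, confidence):
        self.entries.setdefault(topic, []).append((source, value, confidence))

    def read(self, topic):
        return self.entries.get(topic, [])


def fuse_range(board, topic):
    """Confidence-weighted average of all range estimates on the board."""
    observations = board.read(topic)
    if not observations:
        return None
    total_weight = sum(c for _, _, c in observations)
    return sum(v * c for _, v, c in observations) / total_weight


# Two hypothetical sensors report the range to the same obstacle.
board = Blackboard()
board.post("obstacle_range", "sonar", 4.2, 0.6)  # meters, confidence
board.post("obstacle_range", "laser", 4.0, 0.9)
fused = fuse_range(board, "obstacle_range")
```

Because modules interact only through the board, a new sensor or a new fusion scheme can be added without modifying the others, which is the flexibility the abstract claims for the architecture.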

[1] Rodney A. Brooks, et al., "Visual map making for a mobile robot," Proceedings, 1985 IEEE International Conference on Robotics and Automation, 1985.

[2] Matthew Turk, et al., "The Autonomous Land Vehicle (ALV) Preliminary Road-Following Demonstration," Other Conferences, 1985.

[3] Anita M. Flynn, et al., "Redundant Sensors for Mobile Robot Navigation," 1985.

[4] Jean-Paul Laumond, et al., "Position referencing and consistent world modeling for mobile robots," Proceedings, 1985 IEEE International Conference on Robotics and Automation, 1985.

[5] Ruzena Bajcsy, et al., "Object Recognition Using Vision and Touch," IJCAI, 1985.

[6] Marilyn Nashman, et al., "Real-time cooperative interaction between structured-light and reflectance ranging for robot guidance," Robotica, 1985.

[7] James L. Crowley, et al., "Navigation for an intelligent mobile robot," IEEE J. Robotics Autom., 1985.