Robust Geometric Algorithms for Sensor Planning

We consider the problem of planning sensor strategies that enable a sensor to be automatically configured for robot tasks. In this paper we present robust and efficient algorithms for computing the regions from which a sensor has unobstructed or partially obstructed views of a target in a goal region. We apply these algorithms to the Error Detection and Recovery problem of recognizing whether a goal or failure region has been achieved. Based on these methods and strategies for visually-cued camera control, we have built a robot surveillance system in which one mobile robot navigates to a viewing position from which it has an unobstructed view of a goal region, and then uses visual recognition to detect when a specific target has entered the room.
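To make the visibility computation concrete, the following is a minimal Python sketch, not the paper's algorithm, of one way to approximate the set of viewpoints with an unobstructed view of a target point among polygonal obstacles in the plane: a sight line is unobstructed if it crosses no obstacle edge, and candidate viewpoints are sampled on a grid. The obstacle representation, the grid sampling, and all names (segments_intersect, visible, visibility_region_samples) are illustrative assumptions, not taken from the paper.

    # Sketch only: approximate the unobstructed-view region of a 2D target
    # among polygonal obstacles by testing grid-sampled candidate viewpoints.

    def _orient(a, b, c):
        """Signed area of triangle abc: >0 counterclockwise, <0 clockwise, 0 collinear."""
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

    def _on_segment(a, b, p):
        """True if a collinear point p lies within the bounding box of segment ab."""
        return (min(a[0], b[0]) <= p[0] <= max(a[0], b[0]) and
                min(a[1], b[1]) <= p[1] <= max(a[1], b[1]))

    def segments_intersect(p1, p2, q1, q2):
        """Standard orientation test for intersection of segments (p1,p2) and (q1,q2)."""
        d1 = _orient(q1, q2, p1)
        d2 = _orient(q1, q2, p2)
        d3 = _orient(p1, p2, q1)
        d4 = _orient(p1, p2, q2)
        if ((d1 > 0) != (d2 > 0)) and ((d3 > 0) != (d4 > 0)):
            return True
        # Collinear and endpoint-touching cases count as blocking contact here.
        if d1 == 0 and _on_segment(q1, q2, p1): return True
        if d2 == 0 and _on_segment(q1, q2, p2): return True
        if d3 == 0 and _on_segment(p1, p2, q1): return True
        if d4 == 0 and _on_segment(p1, p2, q2): return True
        return False

    def visible(viewpoint, target, obstacles):
        """True if the sight line from viewpoint to target crosses no obstacle edge."""
        for poly in obstacles:
            n = len(poly)
            for i in range(n):
                a, b = poly[i], poly[(i + 1) % n]
                if segments_intersect(viewpoint, target, a, b):
                    return False
        return True

    def visibility_region_samples(target, obstacles, xs, ys):
        """Grid approximation of the region of viewpoints that see the target."""
        return [(x, y) for x in xs for y in ys if visible((x, y), target, obstacles)]

    if __name__ == "__main__":
        # One square obstacle partially occluding the target at the origin.
        square = [(1.0, -0.5), (2.0, -0.5), (2.0, 0.5), (1.0, 0.5)]
        xs = [i * 0.5 for i in range(-6, 7)]
        ys = [i * 0.5 for i in range(-6, 7)]
        region = visibility_region_samples((0.0, 0.0), [square], xs, ys)
        print(f"{len(region)} of {len(xs) * len(ys)} sampled viewpoints see the target")

This sampling approach only approximates the viewing region; the paper's contribution is exact, robust geometric computation of the unobstructed and partially obstructed regions, which this sketch does not attempt.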
