Extracting Visibility Information by Following Walls

Summary. This paper presents an analysis of a simple robot model called the Bitbot. The Bitbot has limited capabilities: it can reliably follow walls and sense contact with a wall. Although the Bitbot has neither a range sensor nor a camera, it is able to acquire visibility information from the environment, which is then used to solve a pursuit-evasion task. Our developments center on characterizing the information the Bitbot acquires. At any given moment, due to sensing uncertainty, the robot does not know its current state. Uncertainty in the state is one of the central issues in robotics; the Bitbot model therefore serves as an example of how the notion of an information space naturally handles such uncertainty. We show that state estimation with the Bitbot is a challenging problem, related to the well-known open problem of characterizing visibility graphs in computational geometry. However, state estimation turns out to be unnecessary for accomplishing the Bitbot's visibility tasks. We show how a pursuit-evasion strategy can be derived through careful manipulation of observation histories, and we present an analysis of the algorithm along with experimental results.
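To make the information-space idea concrete, the following is a minimal Python sketch, not taken from the paper, of an information state maintained as the set of state hypotheses consistent with an observation history. The state representation, observation alphabet, and transition/observation functions here are hypothetical placeholders standing in for the Bitbot's wall-following model.

```python
# Illustrative sketch only: maintain the set of hypotheses consistent with
# the robot's observation history (a nondeterministic information state).
# State encoding, observations, and models below are assumptions for this
# example, not the paper's actual Bitbot model.

from typing import Callable, Iterable, Set, Tuple

# Hypothetical state: (wall segment index, direction of travel along the boundary).
State = Tuple[int, int]
Observation = str  # e.g., "contact" or "no_contact"


def update_information_state(
    hypotheses: Set[State],
    observation: Observation,
    transition: Callable[[State], State],
    expected_obs: Callable[[State], Observation],
) -> Set[State]:
    """Advance each hypothesis one step and keep only those whose predicted
    observation matches what the robot actually sensed."""
    survivors: Set[State] = set()
    for state in hypotheses:
        successor = transition(state)
        if expected_obs(successor) == observation:
            survivors.add(successor)
    return survivors


def information_state_from_history(
    initial_hypotheses: Set[State],
    history: Iterable[Observation],
    transition: Callable[[State], State],
    expected_obs: Callable[[State], Observation],
) -> Set[State]:
    """Fold an entire observation history into a single information state."""
    hypotheses = set(initial_hypotheses)
    for obs in history:
        hypotheses = update_information_state(
            hypotheses, obs, transition, expected_obs
        )
    return hypotheses
```

The key point illustrated is that the robot never needs a single state estimate: a task strategy can be defined directly over such sets of hypotheses, which is how an information space sidesteps full state estimation.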
