Belief-Driven Manipulator Visual Servoing for Less Controlled Environments

This paper presents the architecture of a feedforward manipulator control strategy based on a belief function that may be appropriate for less controlled environments. In this architecture, the belief about the environmental state, described by a probability density function, is maintained by a recursive Bayesian estimation process. A likelihood is derived from each observation regardless of whether the targeted features of the environmental state have been detected, so the controller receives continuously evolving information and an inaccurate belief can evolve into an accurate one. Control actions are determined by maximizing objective functions using non-linear optimization, and forward models transform candidate control actions into predicted states so that the objective functions can be expressed in task space. A first set of examples numerically investigates the validity of the proposed strategy by demonstrating control in a two-dimensional scenario. A more realistic application is then presented in which a robotic manipulator executes a searching and tracking task using an eye-in-hand vision sensor.
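
As a concrete illustration of the loop sketched above, the following Python example gives a minimal, hypothetical reading of the architecture: a grid-based belief over a two-dimensional target position is updated recursively with a likelihood defined for both detection and no-detection observations, and a one-step greedy controller uses a simple forward model to score candidate sensor motions by the belief mass falling inside the predicted field of view. The grid resolution, field-of-view sensor model, function names, and greedy search are all assumptions made for illustration, not the implementation used in the paper.

```python
# Minimal sketch (not the authors' implementation) of a belief-driven
# sense-update-act loop: recursive Bayesian belief update with likelihoods
# for both detection and no-detection, plus a greedy one-step controller
# that uses a forward model to evaluate an objective in task space.
import numpy as np

GRID = np.linspace(0.0, 10.0, 101)   # candidate target coordinates (assumed workspace)
XX, YY = np.meshgrid(GRID, GRID)     # 2-D grid over which the belief is defined

def detection_likelihood(sensor_pos, detected, meas=None,
                         fov=2.0, p_detect=0.9, noise=0.3):
    """Likelihood of the observation for every grid cell.

    If the feature was detected, use a Gaussian about the measurement;
    if not, the 'negative' information is that cells inside the field of
    view are less likely to contain the target.
    """
    in_fov = np.hypot(XX - sensor_pos[0], YY - sensor_pos[1]) < fov
    if detected:
        d2 = (XX - meas[0]) ** 2 + (YY - meas[1]) ** 2
        return p_detect * np.exp(-0.5 * d2 / noise ** 2) * in_fov + 1e-12
    # No detection: the target is probably outside the sensed region.
    return np.where(in_fov, 1.0 - p_detect, 1.0)

def bayes_update(belief, sensor_pos, detected, meas=None):
    """Recursive Bayesian update: posterior proportional to likelihood x prior."""
    post = belief * detection_likelihood(sensor_pos, detected, meas)
    return post / post.sum()

def choose_action(belief, sensor_pos, candidates, fov=2.0):
    """Greedy one-step control: pick the motion whose predicted sensor pose
    (forward model: pos + delta) maximizes belief mass inside the field of view."""
    best, best_score = None, -np.inf
    for delta in candidates:
        pred = sensor_pos + delta                               # forward model
        in_fov = np.hypot(XX - pred[0], YY - pred[1]) < fov
        score = belief[in_fov].sum()                            # objective in task space
        if score > best_score:
            best, best_score = delta, score
    return best

# Usage: uniform prior, one no-detection observation, then one control step.
belief = np.full_like(XX, 1.0 / XX.size)
sensor = np.array([5.0, 5.0])
belief = bayes_update(belief, sensor, detected=False)
moves = [np.array(d, dtype=float) for d in [(1, 0), (-1, 0), (0, 1), (0, -1)]]
sensor = sensor + choose_action(belief, sensor, moves)
```

In this sketch the no-detection likelihood down-weights cells inside the field of view, which is the mechanism that lets "negative" observations keep the belief evolving even while the targeted features remain unseen.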
