Sensor Integration and Task Planning for Mobile Manipulation

Robotic mobile manipulation in unstructured environments requires the integration of several key research areas, such as localization, navigation, object recognition, visual tracking/servoing, grasping, and object manipulation. It has been demonstrated that, given these capabilities, a robust system can be designed through simple sequencing of basic skills [19]. To provide the robustness and flexibility required of the overall robotic system in unstructured and dynamic everyday environments, it is important to consider a wide range of individual skills using different sensory modalities. In this work, we consider a combination of deliberative and reactive control, together with the use of multiple sensory modalities, for the modeling and execution of manipulation tasks. Special consideration is given to the design of a vision system for object recognition and scene segmentation, as well as to learning principles for grasping.
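The "simple sequencing of basic skills" mentioned above can be illustrated with a minimal sketch. This is a hypothetical example, not the architecture from the paper: each skill (navigate, recognize, grasp, and so on) is modeled as a callable returning success or failure, the deliberative layer supplies the ordered skill list, and a reactive execution loop retries a failing skill a bounded number of times before aborting the task.

```python
from typing import Callable, List

# A basic skill reports True on success, False on failure.
Skill = Callable[[], bool]

def execute_task(skills: List[Skill], max_retries: int = 2) -> bool:
    """Run skills in sequence; retry each up to max_retries times on failure."""
    for skill in skills:
        attempts = 0
        while not skill():
            attempts += 1
            if attempts > max_retries:
                return False  # abort the whole task if one skill keeps failing
    return True

# Toy usage with stand-in skills (real skills would wrap sensing/actuation).
if __name__ == "__main__":
    navigate = lambda: True
    recognize = lambda: True
    grasp = lambda: True
    print(execute_task([navigate, recognize, grasp]))
```

In a real system each callable would encapsulate its own sensory feedback loop (e.g. visual servoing during grasping), so the sequencer stays simple while robustness lives inside the skills.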

[1] D. Comaniciu et al., "Real-time tracking of non-rigid objects using mean shift," Proc. IEEE Conf. Computer Vision and Pattern Recognition (CVPR), 2000.

[2] D. Kragic et al., "Combination of foveal and peripheral vision for object recognition and pose estimation," Proc. IEEE Int. Conf. Robotics and Automation (ICRA), 2004.

[3] L. Petersson et al., "DCA: a distributed control architecture for robotics," Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems (IROS), 2001.

[4] M. R. Cutkosky et al., "On grasp choice, grasp models, and the design of hands for manufacturing tasks," IEEE Trans. Robotics and Automation, 1989.

[5] R. H. Katz et al., Contemporary Logic Design, 2004.

[6] G. D. Hager et al., "Task modeling and specification for modular sensory based human-machine cooperative systems," Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems (IROS), 2003.

[7] L. Petersson et al., "Systems integration for real-world manipulation tasks," Proc. IEEE Int. Conf. Robotics and Automation (ICRA), 2002.

[8] D. G. Lowe et al., "Object recognition from local scale-invariant features," Proc. Seventh IEEE Int. Conf. Computer Vision (ICCV), 1999.

[9] D. Kragic et al., "Interactive grasp learning based on human demonstration," Proc. IEEE Int. Conf. Robotics and Automation (ICRA), 2004.

[10] G. D. Hager et al., "Human-Machine Collaborative Systems for Microsurgical Applications," Int. J. Robotics Research, 2005.

[11] R. C. Arkin et al., Behavior-Based Robotics, 1998.

[12] D. Kragic et al., "Weak models and cue integration for real-time tracking," Proc. IEEE Int. Conf. Robotics and Automation (ICRA), 2002.

[13] L. Petersson et al., "Visually guided manipulation tasks," Robotics and Autonomous Systems, 2002.

[14] M. Dorigo et al., Intelligent Robots and Autonomous Agents, 2002.

[15] D. Kragic et al., "Real-time tracking meets online grasp planning," Proc. IEEE Int. Conf. Robotics and Automation (ICRA), 2001.

[16] R. Dillmann et al., "Interactive Robot Programming Based on Human Demonstration and Advice," Sensor Based Intelligent Robots, 1998.

[17] E. Malis et al., "2 1/2 D Visual Servoing," 1998.

[18] P. I. Corke et al., "A tutorial on visual servo control," IEEE Trans. Robotics and Automation, 1996.