A Multi-Sensorial Hybrid Control for Robotic Manipulation in Human-Robot Workspaces

Autonomous manipulation in semi-structured environments shared with human operators is an increasingly common task in robotic applications. This paper describes an intelligent multi-sensorial approach that addresses this problem by providing a multi-robot platform with a high degree of autonomy and the ability to perform complex tasks. The proposed sensorial system is composed of a hybrid visual servo control to efficiently guide the robot towards the object to be manipulated, an inertial motion capture system and an indoor localization system to avoid possible collisions between human operators and robots sharing the same workspace, and a tactile sensing algorithm to correctly manipulate the object. The proposed controller employs the whole multi-sensorial system and combines the measurements of all the sensors across the two phases of the robot task: an approach phase, in which the robot moves towards the object to be grasped, and a manipulation phase, in which the object is handled. In both phases, the unexpected presence of humans is taken into account. The paper also presents the successful results obtained in several experimental setups, which verify the validity of the proposed approach.
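The two-phase structure described above (a visual-servo approach phase followed by a tactile manipulation phase, with a human-safety check active in both) can be summarised with a short control-loop sketch. The sketch below is illustrative only: the class and method names, gains, and thresholds (`TwoPhaseController`, `SAFE_DISTANCE`, `kp_visual`, etc.) are assumptions for exposition, not the authors' actual implementation.

```python
# Minimal sketch of a two-phase hybrid controller, assuming hypothetical
# sensor interfaces; names, gains, and thresholds are illustrative only.
from dataclasses import dataclass
from enum import Enum, auto

import numpy as np


class Phase(Enum):
    APPROACH = auto()      # hybrid visual servoing towards the object
    MANIPULATION = auto()  # tactile-driven grasping of the object


@dataclass
class SensorReadings:
    image_error: np.ndarray     # visual-servo feature error (approach phase)
    tactile_forces: np.ndarray  # fingertip contact forces (manipulation phase)
    human_distance: float       # metres to the closest tracked human operator


class TwoPhaseController:
    """Switches between visual-servo approach and tactile manipulation,
    stopping the robot when a human enters the shared workspace."""

    SAFE_DISTANCE = 1.0    # assumed safety radius around the robot [m]
    GRASP_THRESHOLD = 0.05 # assumed feature-error norm below which grasping starts
    TARGET_FORCE = 2.0     # assumed desired fingertip contact force [N]

    def __init__(self, kp_visual: float = 0.5, kp_force: float = 0.1):
        self.kp_visual = kp_visual
        self.kp_force = kp_force
        self.phase = Phase.APPROACH

    def step(self, s: SensorReadings) -> np.ndarray:
        # Safety check common to both phases: stop if a human is too close,
        # using the human-tracking (motion capture + indoor localization) data.
        if s.human_distance < self.SAFE_DISTANCE:
            size = (s.image_error.size if self.phase is Phase.APPROACH
                    else s.tactile_forces.size)
            return np.zeros(size)

        if self.phase is Phase.APPROACH:
            # Proportional visual-servo law driving the feature error to zero.
            command = -self.kp_visual * s.image_error
            if np.linalg.norm(s.image_error) < self.GRASP_THRESHOLD:
                self.phase = Phase.MANIPULATION
            return command

        # Manipulation phase: regulate fingertip forces towards the target.
        force_error = self.TARGET_FORCE - s.tactile_forces
        return self.kp_force * force_error
```

The switching condition and the safety override are the essential design choices here: the same human-tracking information gates both phases, while the sensor driving the motion command changes from vision to touch once the object is within grasping range.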
