Estimating an Articulated Tool's Kinematics via Visuo-Tactile Based Robotic Interactive Manipulation

Using articulated tools remains a challenging task for autonomous robots. One difficulty is automatically estimating the tool's kinematics model, which cannot be obtained from a single passive observation because some information, such as a rotation axis (hinge), can only be detected while the tool is being used. Inspired by an infant using both hands while playing with an articulated toy, we employ a dual-arm robotic setup and propose an interactive manipulation strategy based on visuo-tactile servoing to estimate the tool's kinematics model. In our method, one hand holds the tool's handle stably while the other arm, equipped with a tactile finger, flips the movable part of the articulated tool. A novel visuo-tactile servoing controller implements the flipping task by integrating vision and tactile feedback in a compact control loop. To cope with the movable part becoming temporarily invisible to the camera, a data-fusion method that combines the visual measurement of the movable part with the fingertip's motion trajectory is used to optimally estimate the movable part's orientation. The tool's essential kinematic parameters are estimated by geometric calculation while the movable part is flipped by the finger. We evaluate our method by flipping the pivoting cleaning head (flap) of a wiper and estimating the wiper's kinematic parameters. We demonstrate that the flap is flipped robustly even when it is briefly invisible, that the estimated flap orientation tracks the ground-truth data closely, and that the wiper's kinematic parameters are estimated correctly.
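The abstract states that the hinge parameters are recovered by geometric calculation from the movable part's motion during flipping. The paper's exact computation is not given here, but a minimal sketch of one standard approach illustrates the idea: points traced by the fingertip (or a tracked marker on the flap) during the flip lie on a circular arc about the hinge, so a plane fit yields the axis direction and an in-plane circle fit yields a point on the axis. The function name and interface below are hypothetical, not from the paper.

```python
import numpy as np

def estimate_hinge_axis(points):
    """Estimate a revolute (hinge) axis from 3D points traced by a point
    on the movable part as it is flipped about the hinge.

    Returns (axis_direction, axis_point): a unit vector along the hinge
    and a point lying on it. Assumes points span a sufficient arc.
    """
    P = np.asarray(points, dtype=float)
    centroid = P.mean(axis=0)

    # The arc lies in a plane perpendicular to the hinge: the direction of
    # least variance (smallest singular value) is the plane normal, i.e.
    # the hinge axis direction.
    _, _, Vt = np.linalg.svd(P - centroid)
    axis_dir = Vt[2]

    # Express the points in an in-plane basis and fit a circle by
    # algebraic least squares: 2*cx*x + 2*cy*y + c = x^2 + y^2.
    u, v = Vt[0], Vt[1]
    x = (P - centroid) @ u
    y = (P - centroid) @ v
    A = np.column_stack([2.0 * x, 2.0 * y, np.ones_like(x)])
    b = x**2 + y**2
    cx, cy, _ = np.linalg.lstsq(A, b, rcond=None)[0]

    # The circle center, mapped back to 3D, is a point on the hinge axis.
    axis_point = centroid + cx * u + cy * v
    return axis_dir, axis_point
```

With noise-free synthetic points on a quarter-circle arc this recovers the axis direction (up to sign) and a point on the axis exactly; with real visual or tactile measurements, a robust fit (e.g. RANSAC over the same model) would typically be layered on top.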
