Autonomous Precision Pouring From Unknown Containers

We autonomously pour from unknown symmetric containers found in a typical wet laboratory, toward the development of a robot-assisted, rapid experiment-preparation system. The robot estimates the symmetric geometry of the pouring container, then leverages simulated pours as priors for a given fluid to pour precisely and quickly in a single attempt. Fluid in the transparent receiving container is detected by combining weight and vision measurements. The change of volume in the receiver is a function of the pouring container's geometry, the pouring angle, and the angular rate. To determine the volumetric flow rate, the profile of maximum containable volume as a function of pouring angle is estimated, along with the time delay of the fluid exiting the container. A trapezoidal trajectory-generation algorithm prescribes the desired volumetric flow rate as a function of the estimation accuracy, and a hybrid control strategy then attenuates the volumetric error. Three methods for estimating the volume-angle profile are compared, and a combination of online system identification and leveraged model priors is shown to yield reliable performance. The major contributions of this work are a system capable of pouring quickly and precisely from varying symmetric containers in a single attempt with limited priors, and a novel fluid-detection method. The system is implemented on the Rethink Robotics Sawyer and KUKA LBR iiwa manipulators.
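As a minimal sketch of the trapezoidal prescription of the desired volumetric flow rate, the Python snippet below generates a ramp/plateau/ramp flow-rate trajectory whose integral equals the commanded pour volume. It is illustrative only, not the paper's implementation: the function name, the millilitre units, the ramp and plateau parameters, and the triangular fallback for small target volumes are assumptions introduced here for clarity.

import numpy as np

def trapezoidal_flow_profile(target_volume_ml, q_max_ml_s, q_ramp_ml_s2, dt=0.01):
    """Sketch of a trapezoidal volumetric flow-rate trajectory q(t) whose
    time integral equals the requested pour volume (hypothetical parameters)."""
    # Time and volume consumed by the ramp-up and ramp-down phases combined.
    ramp_time = q_max_ml_s / q_ramp_ml_s2
    ramp_volume = q_max_ml_s * ramp_time  # two triangles, each 0.5 * q_max * ramp_time

    if ramp_volume >= target_volume_ml:
        # Target too small to reach the plateau: use a triangular profile whose
        # peak flow rate makes the enclosed area match the target volume.
        q_peak = np.sqrt(target_volume_ml * q_ramp_ml_s2)
        ramp_time = q_peak / q_ramp_ml_s2
        plateau_time = 0.0
    else:
        q_peak = q_max_ml_s
        plateau_time = (target_volume_ml - ramp_volume) / q_max_ml_s

    total_time = 2.0 * ramp_time + plateau_time
    t = np.arange(0.0, total_time + dt, dt)
    # Elementwise minimum of ramp-up, plateau, and ramp-down segments.
    q = np.minimum.reduce([
        q_ramp_ml_s2 * t,
        np.full_like(t, q_peak),
        q_ramp_ml_s2 * (total_time - t),
    ])
    return t, np.clip(q, 0.0, None)

In a scheme like the one described above, the prescribed flow rate q(t) would then be mapped to a container tipping rate through the estimated volume-angle profile, e.g. dtheta/dt = -q(t) / (dV_max/dtheta), with the derivative taken from the identified profile; that mapping is an assumption of this sketch rather than a detail given in the abstract.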
