Analysis of human arm movement for catching a moving object

In this paper, we analyze human arm movement for catching a moving object. In the experiment, two different types of catching motion are performed. The first is a simple catch, in which no consideration is given to reducing the impact of contact between the hand and the object. The second is a more careful operation, which requires the relative velocity between the hand and the object to be reduced sufficiently. In the former, roughly straight approach trajectories to the object are observed, and accurate positioning and timing of the catch are most important; these characteristics show no apparent change as the object speed increases. The latter is more complicated and is divided into four phases: [Phase 1: straight approach to the object], [Phase 2: turning the hand-tip movement], [Phase 3: accelerating the hand-tip to reduce the velocity error], and [Phase 4: positioning at the object and tracking it]. In this case, the subject varies these characteristics according to the object velocity and thereby realizes a safe catching operation, as illustrated by the sketch below.
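To make the four-phase decomposition concrete, the following is a minimal 2-D sketch of a careful catch of an object moving at constant velocity along the x-axis. All gains, thresholds, speeds, the starting geometry, and the helper names (simulate, object_pos) are illustrative assumptions for this sketch, not values or methods taken from the experiment.

```python
import numpy as np

# Minimal 2-D sketch of the four-phase careful catch described above.
# The object moves along the x-axis at constant velocity; all gains,
# thresholds, speeds, and the starting geometry are illustrative
# assumptions, not values measured in the experiment.

DT = 0.01                        # integration step [s]
V_OBJ = np.array([0.3, 0.0])     # object velocity [m/s] (assumed)
P_OBJ0 = np.array([0.2, 0.0])    # object start position [m] (assumed)
APPROACH_SPEED = 0.8             # hand-tip speed in Phases 1-2 [m/s] (assumed)

def object_pos(t):
    """Position of the object at time t (constant-velocity motion)."""
    return P_OBJ0 + V_OBJ * t

def simulate(t_end=3.0):
    p = np.array([0.0, 0.4])     # hand-tip starts beside the object's track
    v = np.zeros(2)              # hand-tip velocity
    phase, t, log = 1, 0.0, []
    while t < t_end:
        err = object_pos(t) - p
        dist = np.linalg.norm(err)
        if phase == 1:
            # Phase 1: straight approach toward the object.
            v = APPROACH_SPEED * err / (dist + 1e-9)
            if dist < 0.15:
                phase = 2
        elif phase == 2:
            # Phase 2: turn the hand-tip into the object's direction of motion.
            target_dir = V_OBJ / np.linalg.norm(V_OBJ)
            direction = v / np.linalg.norm(v)
            direction += 5.0 * DT * (target_dir - direction)
            direction /= np.linalg.norm(direction)
            v = APPROACH_SPEED * direction
            if direction @ target_dir > 0.99:
                phase = 3
        elif phase == 3:
            # Phase 3: accelerate/decelerate the hand-tip to shrink the
            # relative velocity error before contact.
            v = v + 3.0 * DT * (V_OBJ - v)
            if np.linalg.norm(V_OBJ - v) < 0.02:
                phase = 4
        else:
            # Phase 4: position the hand at the object and track it
            # with a simple proportional correction on the position error.
            v = V_OBJ + 1.5 * err
        p = p + v * DT
        t += DT
        log.append((t, phase, dist, np.linalg.norm(V_OBJ - v)))
    return log

if __name__ == "__main__":
    for t, ph, d, dv in simulate()[::50]:
        print(f"t={t:4.2f}s  phase={ph}  dist={d:5.3f}m  |v_rel|={dv:5.3f}m/s")
```

The phase-switching conditions are simple distance and alignment thresholds chosen for readability; in the paper the subject presumably modulates these transitions continuously with the object velocity rather than switching discretely.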
