SEPO: Selecting by pointing as an intuitive human-robot command interface

Pointing to indicate a direction or position is one of the most intuitive communication mechanisms humans use throughout all stages of life. Our aim is to develop a natural human-robot command interface based on pointing gestures for human-robot interaction (HRI). We propose a Kinect-based interface for selecting by pointing (SEPO) in real-world 3D settings: the user points to a target object or location, and the interface returns the 3D position coordinates of that target. Using this interface, we perform three experiments to study the precision and accuracy of human pointing in typical household scenarios: pointing to a “wall”, pointing to a “table”, and pointing to a “floor”. Our results show that the proposed SEPO interface enables users to point to and select objects with an average 3D position accuracy of 9.6 cm in household situations.
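The abstract does not detail the authors' algorithm, but the core geometry behind resolving a pointing gesture against a wall, table, or floor is a ray-plane intersection: extend a ray along the user's pointing direction (e.g., from tracked hand joints) and intersect it with the planar surface. The sketch below is an illustrative assumption, not the paper's implementation; the joint positions and plane parameters are hypothetical.

```python
import numpy as np

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Return the 3D point where a pointing ray meets a plane, or None.

    origin, direction: ray start (e.g., hand joint) and pointing direction.
    plane_point, plane_normal: any point on the surface and its normal.
    """
    direction = direction / np.linalg.norm(direction)
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-6:
        return None  # ray is (nearly) parallel to the surface
    t = np.dot(plane_normal, plane_point - origin) / denom
    if t < 0:
        return None  # surface lies behind the user
    return origin + t * direction

# Hypothetical example: hand at 1.4 m height, pointing forward and down
# toward the floor plane y = 0 (Kinect-style metric coordinates).
hand = np.array([0.0, 1.4, 0.0])
ray = np.array([0.0, -1.0, 2.0])          # down and forward
floor_point = np.array([0.0, 0.0, 0.0])
floor_normal = np.array([0.0, 1.0, 0.0])
target = ray_plane_intersection(hand, ray, floor_point, floor_normal)
# target lands on the floor 2.8 m in front of the user: [0.0, 0.0, 2.8]
```

In practice the pointing direction could be estimated from pairs of tracked joints (elbow-to-hand or head-to-hand lines), and the measured offset between the intersected point and the true target is what accuracy figures like the 9.6 cm average would quantify.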
