A Novel Method for Estimating Distances from a Robot to Humans Using Egocentric RGB Camera

Estimating distances between people and robots plays a crucial role in understanding social Human–Robot Interaction (HRI) from an egocentric view. It is a key step if robots are to engage in social interactions and collaborate with people as part of human–robot teams. Different sensors can be employed for distance estimation between a person and a robot, and the number of challenges the estimation method must address rises as the sensor technology becomes simpler. When distances are estimated from individual images taken by a single camera in an egocentric position, it is often required that individuals in the scene face the camera, do not occlude each other, and are sufficiently visible for specific facial or body features to be identified. In this paper, we propose a novel method for estimating distances between a robot and people using single images from a single egocentric camera. The method builds on previously proven 2D pose estimation, which tolerates partial occlusions, cluttered backgrounds, and relatively low resolution. Distance with respect to the camera is estimated from the Euclidean distance between the ear and torso keypoints of each person in the image plane. These characteristic points were selected because they remain relatively visible regardless of a person's orientation and their spacing is fairly uniform across age and gender. Experimental validation demonstrates the effectiveness of the proposed method.
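The core computation can be illustrated with a minimal sketch. It assumes per-person keypoints in the 18-point COCO layout produced by a 2D pose estimator such as OpenPose; the index values, the choice of the neck keypoint as the torso reference, and the calibration constant k are assumptions made for this illustration, not the paper's exact implementation. Under a pinhole-camera assumption, the estimated distance is roughly inversely proportional to the ear–torso pixel distance.

    import numpy as np

    # Keypoint indices in the 18-point COCO layout used by OpenPose-style
    # estimators (assumed for this sketch): 1 = neck, 16 = right ear, 17 = left ear.
    NECK, R_EAR, L_EAR = 1, 16, 17

    def ear_torso_pixel_distance(keypoints, conf_threshold=0.1):
        """Euclidean image-plane distance between the better-detected ear and
        the neck (taken here as the torso reference point).

        keypoints: (18, 3) array of (x, y, confidence) for one detected person.
        Returns None if the neck or both ears fall below the confidence threshold.
        """
        neck = keypoints[NECK]
        if neck[2] < conf_threshold:
            return None
        # Use whichever ear was detected with higher confidence.
        ear = max(keypoints[R_EAR], keypoints[L_EAR], key=lambda p: p[2])
        if ear[2] < conf_threshold:
            return None
        return float(np.linalg.norm(ear[:2] - neck[:2]))

    def estimate_camera_distance(keypoints, k=1.0):
        """Map the pixel distance to a metric distance from the camera.

        Assumes the real-world ear-torso extent is roughly constant across
        people, so under a pinhole model the distance scales as k / d_pixels.
        The constant k (metres * pixels) is hypothetical and would be fitted
        from images of people standing at known distances.
        """
        d_px = ear_torso_pixel_distance(keypoints)
        if not d_px:  # None or zero pixel distance
            return None
        return k / d_px

For a frame containing several detected people, the same mapping is applied independently to each person's keypoint set, so partial occlusion of one person does not affect the estimates for the others.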
