User Detection, Tracking and Recognition in Robot Assistive Care Scenarios

The field of assistive robotics is gaining traction in both the research and industry communities. However, the capabilities of existing robotic platforms still require improvement before meaningful human-robot interactions can be implemented. We report on the design and implementation of an external system that significantly augments the person detection, tracking, and identification capabilities of the Pepper robot. We perform a qualitative analysis of the improvements achieved by each system module under different interaction conditions and evaluate the complete system on a scenario for elderly care assistance.
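To make the described architecture more concrete, the sketch below outlines a detect-track-identify pipeline of the kind the abstract refers to, running off-board as an external system. The module names (`PersonDetector`-style detector, tracker, face identifier) and their interfaces are hypothetical placeholders, not the authors' implementation; the actual system may combine a YOLO-style person detector with FaceNet-style face embeddings, but the concrete APIs here are assumptions for illustration only.

```python
# Hypothetical sketch of an external detect -> track -> identify pipeline.
# All module interfaces (detect, update, identify) are placeholders.

from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class Detection:
    box: Tuple[int, int, int, int]  # (x, y, w, h) in image coordinates
    confidence: float


@dataclass
class TrackedPerson:
    track_id: int
    box: Tuple[int, int, int, int]
    identity: Optional[str] = None  # filled in once the person is recognized


class Pipeline:
    """Runs off-board (external to the robot) and returns annotated tracks per frame."""

    def __init__(self, detector, tracker, identifier):
        self.detector = detector      # e.g. a YOLO-style person detector (assumed)
        self.tracker = tracker        # associates detections across frames (assumed)
        self.identifier = identifier  # e.g. a FaceNet-style embedder + matcher (assumed)

    def process(self, frame) -> List[TrackedPerson]:
        # 1. Detect people in the current camera frame.
        detections = self.detector.detect(frame)
        # 2. Associate detections with existing tracks.
        tracks = self.tracker.update(frame, detections)
        # 3. Attach an identity to tracks that have not yet been recognized.
        for person in tracks:
            if person.identity is None:
                person.identity = self.identifier.identify(frame, person.box)
        return tracks
```

Results from such a pipeline would then be sent back to the robot so that its interaction behavior can be personalized to the recognized user.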
