The MOBOT Human-Robot Interaction: Showcasing Assistive HRI

The MOBOT project has envisioned the development of cognitive robotic assistant prototypes that act proactively, adaptively and interactively with respect to elderly people with mild walking and cognitive difficulties. To meet the project's goals, a multimodal action recognition system is being developed to monitor, analyse and predict user actions with a high level of accuracy and detail. Here we discuss how the analysis of human behaviour data, made available through the annotation and study of the project's multimodal-multisensory corpus, has led to a model of human-robot communication that supports effective, natural interaction between users and the assistive robotic platform. We also show how this communication model has been integrated into the robotic platform to enable natural multimodal human-robot interaction, as verified by systematic end-user validation cycles.
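The text above does not spell out how the multimodal recognition component combines its sensory streams; one common approach in such systems is late fusion of per-modality action scores. The following minimal Python sketch illustrates that idea only; the modality names, action labels and weights are hypothetical assumptions, not the project's actual implementation.

import numpy as np

# Hypothetical late-fusion sketch: combine per-modality action scores
# (e.g. speech, gesture, gait sensing) into one predicted user action.
# Action labels, modalities and weights are illustrative assumptions.
ACTIONS = ["stand_up", "walk_forward", "stop", "call_help"]

def fuse_modalities(scores_by_modality, weights=None):
    """Weighted average of per-modality probability vectors over ACTIONS."""
    modalities = list(scores_by_modality)
    if weights is None:
        weights = {m: 1.0 / len(modalities) for m in modalities}
    fused = np.zeros(len(ACTIONS))
    for m in modalities:
        fused += weights[m] * np.asarray(scores_by_modality[m])
    return ACTIONS[int(np.argmax(fused))], fused

# Example: speech strongly suggests "stand_up", gesture is ambiguous.
action, scores = fuse_modalities({
    "speech":  [0.70, 0.10, 0.10, 0.10],
    "gesture": [0.40, 0.30, 0.20, 0.10],
})
print(action, scores)

In this toy run the fused scores favour "stand_up"; a real system would instead rescore hypotheses from trained per-modality recognisers before committing the robotic platform to an assistive action.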
