Social signs processing in a cognitive architecture for a humanoid robot

Abstract A social robot has to recognize human social intentions in order to fully interact with people. A person's intention can be inferred by processing verbal and non-verbal communicative signs. In this work we describe an action classification module embedded in a robot's cognitive architecture, contributing to the interpretation of users' behavior.