Recognition of Human Identity by Detection of User Activity

The paper describes a system able to recognize a user's identity from how he or she looks at the monitor while using a given interface. The system needs no invasive measurements that could limit the naturalness of the user's actions. The proposed approach clusters the sequences of observed points on the screen and characterizes the user's identity according to the relevant detected patterns. Moreover, the system identifies recurring patterns in order to achieve more accurate recognition and to create prototypes of natural facial dynamics in user expressions. The possibility of characterizing people through facial movements introduces a new perspective on human-machine interaction: for example, a user can obtain different contents according to his or her mood, or a software interface can modify itself to retain the attention of a bored user. The success rate of the classification using only 7 parameters is around 68%. The approach is based on k-means, tuned to maximize an index involving the number of true-positive detections and conditional probabilities. A different evaluation of this index makes it possible to focus either on identifying a single user or on spotting a general movement across a wide range of people. The experiments show that the performance can reach 90% correct recognition.
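The clustering step can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the synthetic gaze traces, the choice of k, and the handling of empty clusters are all assumptions, and the paper's 7-parameter features and tuning index are not reproduced here.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means over 2-D gaze points (x, y) observed on the screen.

    Illustrative sketch only; the paper tunes k-means against an index of
    true positives and conditional probabilities, which is omitted here.
    """
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assign each observed gaze point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: (p[0] - centroids[c][0]) ** 2
                                + (p[1] - centroids[c][1]) ** 2)
            clusters[i].append(p)
        # Recompute each centroid as its cluster mean (keep it if empty).
        new = []
        for c, members in zip(centroids, clusters):
            if members:
                new.append((sum(p[0] for p in members) / len(members),
                            sum(p[1] for p in members) / len(members)))
            else:
                new.append(c)
        if new == centroids:  # converged
            break
        centroids = new
    return centroids, clusters

# Hypothetical gaze traces: two users fixating on different screen regions.
user_a = [(100 + i % 5, 200 + i % 7) for i in range(30)]
user_b = [(800 + i % 5, 600 + i % 7) for i in range(30)]
centroids, clusters = kmeans(user_a + user_b, k=2)
```

With well-separated fixation regions, the resulting centroids act as prototypes of each user's looking behavior; in the paper, such detected patterns are what characterize the user's identity.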
