Sensing Movement on Smartphone Devices to Assess User Interaction for Face Verification

Unlocking and protecting smartphone devices has become easier with the introduction of biometric face verification, which promises a secure and quick authentication solution to prevent unauthorised access. However, many challenges remain for this biometric modality in a mobile context, where the user's posture and the capture device are not constrained. This research proposes a method to assess user interaction by analysing sensor data collected in the background on smartphone devices during verification sample capture. From accelerometer data, we extracted magnitude variations and angular acceleration for pitch, roll, and yaw (rotation around the x-axis, y-axis, and z-axis of the smartphone, respectively) as features describing the amplitude and number of movements during the facial image capture process. The results of this experiment demonstrate that good sample quality and high biometric performance can be ensured by applying an appropriate threshold that regulates the amplitude of variations in smartphone movement during facial image capture. Moreover, the results suggest that better quality images are obtained when users spend more time positioning the smartphone before taking an image.
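The feature extraction outlined above could be sketched as follows. This is a minimal illustration, not the paper's implementation: function names and the movement-counting threshold are illustrative assumptions, and yaw is omitted because heading is not observable from gravity-based accelerometer readings alone.

```python
import math

def magnitude(ax, ay, az):
    """Acceleration magnitude of a raw accelerometer sample."""
    return math.sqrt(ax**2 + ay**2 + az**2)

def magnitude_variations(samples):
    """Absolute change in magnitude between consecutive samples."""
    mags = [magnitude(*s) for s in samples]
    return [abs(b - a) for a, b in zip(mags, mags[1:])]

def pitch_roll(ax, ay, az):
    """Static pitch and roll estimates (degrees) from the gravity
    direction; a common tilt approximation, assumed here for
    illustration only."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay**2 + az**2)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def count_movements(variations, threshold):
    """Count variation samples exceeding a (hypothetical) threshold,
    analogous to regulating movement amplitude during capture."""
    return sum(1 for v in variations if v > threshold)
```

In such a scheme, a capture attempt could be rejected when `count_movements` exceeds zero for a suitably chosen threshold, mirroring the thresholding idea described in the abstract.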