Spoofing 2D Face Detection: Machines See People Who Aren't There

Machine learning is increasingly used to make sense of the physical world, yet it can be vulnerable to adversarial manipulation. We examine the Viola-Jones 2D face detection algorithm to determine whether one can construct images that the algorithm detects as faces but that humans would never recognize as faces. We show that such images exist: Viola-Jones confidently reports a face where no human would see one. Moreover, we show that these attacks survive the physical world: the images continue to fool the detector even after being printed and then photographed.
