k-Nearest Neighbors Associative Memory Model for Face Recognition

Associative memory (AM) models for human face recognition have been previously studied in psychology and neuroscience. A kernel-based AM model (KAM) was recently proposed and has demonstrated good recognition performance. KAM first transforms the input space into a kernel feature space and then reconstructs the input from the kernel features. For a given subject, KAM uses all of the training samples to build the model, regardless of the query face image. This not only incurs unnecessary overhead in model building when the number of samples is large, but also makes the model less robust when the training samples contain outliers, for example due to occlusion or illumination changes. In this paper, an improved associative memory model is investigated by combining KAM with the k-Nearest Neighbors classification algorithm. Named the k-Nearest Neighbors Associative Memory (kNN-AM), the model takes into account the closeness between a query face image and the training prototype face images. A modular scheme for applying the proposed kNN-AM to face recognition is also discussed. As a multi-class classification problem, face recognition can be carried out by simply comparing which associative memory model best describes a given query face image. Results of extensive experiments on several well-known face databases show that kNN-AM achieves very satisfactory recognition accuracy.
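To make the modular recognition scheme concrete, the following is a minimal sketch of one plausible reading of kNN-AM: for each subject, only the k training prototypes nearest to the query are used to build a kernel auto-associative memory, and the query is assigned to the subject whose memory reconstructs it with the smallest error. The Gaussian kernel, the regularization term, and all function names and parameters (`sigma`, `reg`, `k`) are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    # K(a, b) = exp(-||a - b||^2 / (2 * sigma^2)), broadcast over the last axis
    d2 = np.sum((a - b) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def knn_am_reconstruct(query, prototypes, k=3, sigma=1.0, reg=1e-6):
    """Reconstruct `query` from the k nearest prototypes of one subject
    using a kernel (RBF) auto-associative memory (illustrative sketch)."""
    # select the k prototypes closest to the query (the kNN step)
    dists = np.linalg.norm(prototypes - query, axis=1)
    nearest = prototypes[np.argsort(dists)[:k]]
    # regularized kernel Gram matrix of the selected prototypes
    G = gaussian_kernel(nearest[:, None, :], nearest[None, :, :], sigma)
    G += reg * np.eye(G.shape[0])
    # kernel features of the query with respect to the selected prototypes
    kq = gaussian_kernel(nearest, query, sigma)
    # weights mapping the kernel features back to the input (image) space
    w = np.linalg.solve(G, kq)
    return nearest.T @ w  # reconstructed face vector

def classify(query, subject_prototypes, k=3, sigma=1.0):
    """Assign the query to the subject whose kNN-AM reconstructs it best."""
    errors = {
        label: np.linalg.norm(query - knn_am_reconstruct(query, protos, k, sigma))
        for label, protos in subject_prototypes.items()
    }
    return min(errors, key=errors.get)
```

Here `subject_prototypes` would map each subject label to an array of that subject's training face vectors; restricting each memory to the query's k nearest prototypes is what, per the abstract, reduces model-building overhead and the influence of outlying training samples.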