A modular neural network for super-resolution of human faces

Abstract. This paper presents an original and versatile modular neural network architecture and its application to super-resolution. Each module is a small multilayer perceptron, trained with the Levenberg–Marquardt method, and serves as a generic building block. By connecting the modules so that their individual mappings compose, we build a lattice of modules that implements full connectivity between the pixels of the low-resolution input image and those of the higher-resolution output image. Once the network has been trained on patterns consisting of low- and high-resolution images of objects or scenes of the same kind, it can dramatically enhance the resolution of a similar object's representation. The modular nature of the architecture allows the training phase to be readily parallelized on a network of PCs. Finally, the network is shown to perform global-scale reconstruction of human faces from very low-resolution input images.
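To make the idea concrete, the sketch below illustrates, in rough outline, how a lattice of small MLP modules can map a low-resolution face image to a higher-resolution one. It is not the authors' implementation: the patch layout, layer sizes, and the use of a standard PyTorch optimizer (Adam) in place of Levenberg–Marquardt training are all illustrative assumptions, as is the random toy data standing in for low/high-resolution face pairs.

```python
# Minimal sketch (not the paper's implementation): a "lattice" of small MLP
# modules, each producing one high-resolution patch from the whole
# low-resolution image, so every output pixel is connected to every input
# pixel. The paper trains each module with Levenberg-Marquardt; Adam is used
# here purely for brevity. All sizes and names are illustrative.

import torch
import torch.nn as nn

class SmallMLP(nn.Module):
    """One generic building block: a small multilayer perceptron."""
    def __init__(self, n_in, n_hidden, n_out):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_in, n_hidden),
            nn.Tanh(),
            nn.Linear(n_hidden, n_out),
        )

    def forward(self, x):
        return self.net(x)

class ModuleLattice(nn.Module):
    """A lattice of SmallMLP modules, one per high-resolution output patch."""
    def __init__(self, lr_size=8, hr_size=32, patch=8, hidden=32):
        super().__init__()
        self.hr_size, self.patch = hr_size, patch
        n_in = lr_size * lr_size          # flattened low-resolution image
        n_out = patch * patch             # one high-resolution patch
        n_patches = (hr_size // patch) ** 2
        self.blocks = nn.ModuleList(
            [SmallMLP(n_in, hidden, n_out) for _ in range(n_patches)]
        )

    def forward(self, lr_batch):
        # lr_batch: (B, lr_size, lr_size) -> (B, hr_size, hr_size)
        b = lr_batch.size(0)
        x = lr_batch.flatten(1)
        per_row = self.hr_size // self.patch
        rows, k = [], 0
        for _ in range(per_row):
            cols = []
            for _ in range(per_row):
                cols.append(self.blocks[k](x).view(b, self.patch, self.patch))
                k += 1
            rows.append(torch.cat(cols, dim=2))
        return torch.cat(rows, dim=1)

# Toy training loop on random data, standing in for (LR, HR) face pairs.
if __name__ == "__main__":
    torch.manual_seed(0)
    hr = torch.rand(64, 32, 32)                                    # 32x32 "faces"
    lr = nn.functional.avg_pool2d(hr.unsqueeze(1), 4).squeeze(1)   # 8x8 inputs
    model = ModuleLattice()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for step in range(200):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(lr), hr)
        loss.backward()
        opt.step()
    print("final MSE:", loss.item())
```

Because each module only needs the low-resolution inputs and its own target patches, the modules can be trained independently, which is what makes the parallelization across a network of PCs described in the abstract straightforward.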
