Learning Morphological Transformations with Recurrent Neural Networks

Abstract. Deep learning techniques have been used successfully in recent years to learn useful image transformations and features, thus contributing significantly to the advancement of neural networks. However, deep networks suffer from the drawback that they require long training times and a multitude of parameters that must be hand-tuned for optimal performance. In this paper we investigate the use of recurrent neural network architectures to learn useful transformations of an image (object) progressively over time. Learning these latent transformations enables the recurrent architecture to predict, with a high degree of accuracy, the original representation of an object from its transformed representations.
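
The following is a minimal, hypothetical sketch of the setup the abstract describes: a recurrent network observes a sequence of progressively transformed views of an image and is trained to reconstruct the original, untransformed image. The GRU encoder, layer sizes, and the synthetic stand-in data are illustrative assumptions, not the paper's actual architecture or training regime.

```python
import torch
import torch.nn as nn

class TransformationRNN(nn.Module):
    """Recurrent model that maps a sequence of transformed image views
    back to the original image (illustrative sketch)."""

    def __init__(self, image_dim=28 * 28, hidden_dim=256):
        super().__init__()
        self.rnn = nn.GRU(input_size=image_dim, hidden_size=hidden_dim,
                          batch_first=True)
        self.decoder = nn.Linear(hidden_dim, image_dim)

    def forward(self, transformed_seq):
        # transformed_seq: (batch, time, image_dim), each step a more
        # heavily transformed view of the same underlying image.
        _, h_last = self.rnn(transformed_seq)
        # Predict the original (untransformed) image from the final state.
        return torch.sigmoid(self.decoder(h_last.squeeze(0)))

if __name__ == "__main__":
    model = TransformationRNN()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    # Dummy data standing in for a real dataset: random "originals" and
    # noisy copies acting as their progressively transformed versions.
    batch, steps, dim = 32, 5, 28 * 28
    original = torch.rand(batch, dim)
    transformed_seq = original.unsqueeze(1).repeat(1, steps, 1)
    transformed_seq = transformed_seq + 0.1 * torch.randn_like(transformed_seq)

    for epoch in range(10):
        optimizer.zero_grad()
        prediction = model(transformed_seq)
        loss = loss_fn(prediction, original)
        loss.backward()
        optimizer.step()
```

In this toy setting the reconstruction target is the untransformed image and the loss is a simple pixel-wise MSE; a faithful reproduction of the paper would substitute the actual morphological transformations and evaluation protocol it uses.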
