Dynamic Content Mining

Neural networks and convolutional neural networks can be viewed as functions that take a vector as input and compute a distribution over the set of possible classes. Such networks have no notion of temporal order and no memory, so they are not well suited to dynamic content mining tasks such as speech recognition or video processing. In this chapter we introduce models able to handle the temporal structure of visual content.
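
To make this contrast concrete, the sketch below (a toy illustration, not code from the chapter; all dimensions and weights are arbitrary placeholders) compares a stateless classifier, which maps a single input vector to a class distribution independently of anything seen before, with a simple recurrent update whose hidden state carries information forward through a sequence.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Stateless classifier: one input vector x is mapped to a class
# distribution; each call is independent of all previous inputs.
W = rng.normal(size=(3, 4))   # 3 classes, 4 input features (toy sizes)
x = rng.normal(size=4)
print(softmax(W @ x))         # distribution for this single input

# Recurrent update: a hidden state h is carried across time steps,
# so the prediction at step t can depend on the whole sequence x_1..x_t.
Wx = rng.normal(size=(5, 4))  # input-to-hidden weights
Wh = rng.normal(size=(5, 5))  # hidden-to-hidden weights
Wo = rng.normal(size=(3, 5))  # hidden-to-output weights

h = np.zeros(5)
sequence = rng.normal(size=(6, 4))   # 6 time steps of 4-dim inputs
for x_t in sequence:
    h = np.tanh(Wx @ x_t + Wh @ h)   # state update: memory of past inputs
print(softmax(Wo @ h))               # distribution conditioned on the sequence
```

Because the hidden state h is reused at every step, the prediction after the final step depends on the entire input sequence; the stateless classifier has no way to express such a dependency.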
