Deep Residual Convolutional and Recurrent Neural Networks for Temperature Estimation in Permanent Magnet Synchronous Motors
Wilhelm Kirchgässner | Oliver Wallscheid | Joachim Böcker