Training Deep Neural Networks with Low Precision Input Data: A Hurricane Prediction Case Study

Training deep neural networks requires huge amounts of data. The next generation of intelligent systems will generate and consume massive volumes of data that are transferred along machine learning workflows. We study the effect of reducing the precision of this data at an early stage of the workflow (i.e. the input) on both the prediction accuracy and the learning behaviour of deep neural networks. We show that high-precision input data can be transformed to low precision before being fed to a neural network model with negligible loss of accuracy; consequently, a high-precision representation of the input data is not strictly necessary for some applications. These findings pave the way for applying deep learning in areas where acquiring high-precision data is difficult due to memory and computational power constraints. We demonstrate this with a hurricane prediction case study, training a deep neural network to predict the monthly number of hurricanes over the Atlantic Ocean, first with high-precision input data and then with low-precision data. The resulting drop in prediction accuracy is less than 2%.
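As a minimal sketch of the input-precision reduction the abstract describes, the snippet below casts high-precision features down to half precision before they would be fed to a model; the array names, shapes, and the float16 target are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

# Hypothetical example: reduce the stored precision of input features
# before training. Only the input data is quantized; the model itself
# is unchanged.
rng = np.random.default_rng(0)
X_high = rng.standard_normal((1000, 8)).astype(np.float64)  # high-precision inputs

# Round-trip through float16: the data now carries only half-precision
# information, cast back to a typical framework working dtype.
X_low = X_high.astype(np.float16).astype(np.float32)

# The per-element rounding error is small relative to the signal,
# which is why accuracy degrades only slightly.
max_err = float(np.abs(X_high - X_low).max())
print(f"max absolute rounding error: {max_err:.5f}")
```

For standard-normal inputs the half-precision rounding error is on the order of 10^-3, consistent with the small accuracy loss reported in the study.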
