Oil Reservoir Classification Based on Convolutional Neural Network

Given the available oil reservoir data, efficient and accurate classification of oil reservoirs is a key factor for petrochemical enterprises seeking to improve the efficiency of resource utilization. However, the chromatogram data of an oil reservoir are high-dimensional, complex, and noisy. A classification model based on a convolutional neural network (CNN) is therefore proposed to learn features automatically from the sequence data of the oil reservoir, avoiding complex feature engineering. The synthetic minority over-sampling technique (SMOTE) is used to rebalance the class distribution of the samples and thereby reduce the misclassification rate. Finally, the proposed method is applied to the J16 oil dataset taken from an oil field of the China petrochemical industry. The results show that the classification accuracy reaches about 80.8%.
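The core idea of learning features from sequence data with a CNN can be illustrated with a minimal NumPy sketch of one convolution-ReLU-pooling stage. This is an assumption-laden illustration of the general technique, not the paper's actual architecture, kernel sizes, or hyperparameters:

```python
import numpy as np

def conv1d(seq, kernel):
    """Valid-mode 1-D cross-correlation, the core operation of a CNN layer.

    Slides the kernel over the sequence and returns one response per
    position; a learned kernel acts as an automatic feature detector.
    """
    n, k = len(seq), len(kernel)
    return np.array([seq[i:i + k] @ kernel for i in range(n - k + 1)])

def relu(x):
    """Elementwise rectified linear activation."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Non-overlapping max pooling, which downsamples the feature map."""
    m = len(x) // size
    return x[:m * size].reshape(m, size).max(axis=1)

# A step in the sequence; the difference kernel [-1, 1] responds to it.
seq = np.array([0.0, 0.0, 1.0, 1.0])
features = max_pool(relu(conv1d(seq, np.array([-1.0, 1.0]))))
```

In a full model, many such kernels are learned from the labeled chromatogram sequences by backpropagation, and the pooled feature maps feed a classifier layer.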

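The SMOTE step can likewise be sketched in a few lines. The minimal implementation below (plain NumPy, not the paper's code; in practice a library such as imbalanced-learn would be used) shows the essential mechanism: each synthetic minority sample is an interpolation between a real minority sample and one of its k nearest minority-class neighbours:

```python
import numpy as np

def smote(X_minority, n_synthetic, k=5, rng=None):
    """Minimal SMOTE sketch: oversample the minority class by
    interpolating between each sample and one of its k nearest
    minority-class neighbours."""
    rng = np.random.default_rng(rng)
    n = len(X_minority)
    # Pairwise Euclidean distances within the minority class.
    d = np.linalg.norm(X_minority[:, None] - X_minority[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # a sample is not its own neighbour
    nn = np.argsort(d, axis=1)[:, :k]    # k nearest neighbours of each sample
    out = []
    for _ in range(n_synthetic):
        i = rng.integers(n)              # pick a minority sample at random
        j = nn[i, rng.integers(k)]       # pick one of its neighbours
        gap = rng.random()               # interpolation factor in [0, 1)
        out.append(X_minority[i] + gap * (X_minority[j] - X_minority[i]))
    return np.array(out)
```

Because every synthetic point lies on a segment between two real minority samples, the oversampled class stays inside the region the minority data already occupies, which is what makes SMOTE less prone to overfitting than simple duplication.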