Liver cancer is one of the leading causes of cancer death worldwide. According to IARC data for 2018, there were 841,080 new cases of liver cancer and 781,631 deaths. One effort to reduce mortality from liver cancer is early detection through classification of MicroRNA data, since MicroRNA can be used to identify whether a cell is cancerous at the earliest stages. The MicroRNA data studied were obtained from the GDC Data Portal of the National Cancer Institute (NCI). The Deep Neural Network (DNN) is one method that can be used to classify cancer. Data normalization in the DNN maps input values with an unbounded range into a bounded range of output values. Normalization applied during data pre-processing works more optimally when combined with an activation function and Batch Normalization in each hidden layer. Different types of normalization and different activation functions can be compared to obtain the best DNN accuracy. This study compares three types of data normalization, namely Min-Max, Sigmoid, and Softmax, and three activation functions, namely ReLU, Sigmoid, and TanH. The results show that the best accuracy for classifying liver cancer from MicroRNA data, 98.33%, is obtained using Min-Max data normalization, the ReLU activation function, and Batch Normalization with 2 hidden layers, 200 epochs, and a learning rate of 0.004.
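The sketch below illustrates the best-performing configuration reported above: Min-Max normalization in pre-processing, then a DNN with 2 hidden layers, ReLU activations, and Batch Normalization, trained for 200 epochs at a 0.004 learning rate. It is a minimal illustration, not the authors' original code; the use of TensorFlow/Keras, the Adam optimizer, the hidden-layer widths, the batch size, and the placeholder data matrix X with binary labels y are all assumptions.

```python
# Minimal sketch of the reported best configuration (assumptions noted inline).
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers


def min_max_normalize(X):
    # Min-Max normalization: rescale each feature to the [0, 1] range.
    x_min = X.min(axis=0)
    x_max = X.max(axis=0)
    return (X - x_min) / (x_max - x_min + 1e-8)


def build_dnn(n_features):
    # Two hidden layers with ReLU activations, Batch Normalization after each
    # hidden layer, and a sigmoid output for the binary cancer / non-cancer label.
    model = keras.Sequential([
        layers.Input(shape=(n_features,)),
        layers.Dense(128, activation="relu"),   # hidden-layer width is an assumption
        layers.BatchNormalization(),
        layers.Dense(64, activation="relu"),    # hidden-layer width is an assumption
        layers.BatchNormalization(),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(
        optimizer=keras.optimizers.Adam(learning_rate=0.004),  # 0.004 learning rate from the abstract
        loss="binary_crossentropy",
        metrics=["accuracy"],
    )
    return model


# Usage (X, y assumed to be a MicroRNA expression matrix and labels
# prepared from the GDC Data Portal data):
# X_norm = min_max_normalize(X)
# model = build_dnn(X_norm.shape[1])
# model.fit(X_norm, y, epochs=200, batch_size=32, validation_split=0.2)
```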