Hardware Implementation of Tanh Exponential Activation Function using FPGA

Field Programmable Gate Arrays (FPGAs) are among the most active platforms for accelerating Convolutional Neural Networks (CNNs), and the activation function lies at the core of any CNN; deeper CNNs in particular demand efficient non-linear activation functions. In this paper, we implement the Tanh Exponential (TanhExp) activation function on the Artix-7 and Zynq-7000 FPGAs using two approximation methods, piecewise linear approximation and second-order polynomial approximation, with operands in the IEEE 754-2008 floating-point representation. We report the hardware resources each implementation requires and evaluate the efficiency of each approximation method for both TanhExp and its derivative.
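For reference, TanhExp is defined as TanhExp(x) = x * tanh(e^x), with derivative TanhExp'(x) = tanh(e^x) + x * e^x * (1 - tanh^2(e^x)). The C sketch below is a minimal software model of the piecewise linear approach: it evaluates the exact function as a golden reference and compares it against a segment-wise linear interpolation of the kind typically mapped to a small FPGA lookup table plus one multiply-add. The breakpoints, segment count, and clamping thresholds here are illustrative assumptions, not the coefficients used in the paper.

    /*
     * Reference TanhExp and an illustrative piecewise linear (PWL)
     * approximation. Breakpoints and clamping thresholds are assumed
     * for illustration; they are not the paper's coefficients.
     */
    #include <math.h>
    #include <stdio.h>

    /* Exact TanhExp(x) = x * tanh(exp(x)), used as the golden reference. */
    static float tanhexp_ref(float x)
    {
        return x * tanhf(expf(x));
    }

    /* PWL sketch: outside an assumed interval [-4, 2], TanhExp is close
     * to its asymptotes (0 on the left, x on the right), so we clamp;
     * inside, we interpolate linearly between assumed breakpoints. */
    static float tanhexp_pwl(float x)
    {
        static const float xs[] = { -4.f, -3.f, -2.f, -1.f, 0.f, 1.f, 2.f };
        static const int n = sizeof xs / sizeof xs[0];

        if (x <= xs[0])     return 0.0f;  /* TanhExp(x) -> 0 for x << 0 */
        if (x >= xs[n - 1]) return x;     /* TanhExp(x) -> x for x >> 0 */

        for (int i = 0; i < n - 1; ++i) {
            if (x <= xs[i + 1]) {
                float y0 = tanhexp_ref(xs[i]);     /* in hardware: stored ROM constants */
                float y1 = tanhexp_ref(xs[i + 1]);
                float t  = (x - xs[i]) / (xs[i + 1] - xs[i]);
                return y0 + t * (y1 - y0);         /* one multiply, one add per segment */
            }
        }
        return x; /* unreachable */
    }

    int main(void)
    {
        /* Compare the PWL sketch against the exact function on a grid. */
        for (float x = -5.0f; x <= 3.0f; x += 0.5f)
            printf("x=%5.2f  ref=%8.5f  pwl=%8.5f\n",
                   x, tanhexp_ref(x), tanhexp_pwl(x));
        return 0;
    }

In an actual FPGA implementation, the segment endpoint values would be precomputed constants stored in a small ROM rather than runtime calls to the reference function, and the linear search over segments would be replaced by direct indexing on the high-order bits of the input; the second-order polynomial variant would replace the per-segment linear term with an a*x^2 + b*x + c evaluation at the cost of an extra multiplier.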
