Hardware Implementation of Hyperbolic Tangent Activation Function for Floating Point Formats
[1] M. Nirmala Devi,et al. FPGA Realization of Activation Function for Artificial Neural Networks , 2008, 2008 Eighth International Conference on Intelligent Systems Design and Applications.
[2] Mitra Mirhassani,et al. Efficient VLSI Implementation of Neural Networks With Hyperbolic Tangent Activation Function , 2014, IEEE Transactions on Very Large Scale Integration (VLSI) Systems.
[3] Maicon A. Sartin,et al. Approximation of hyperbolic tangent activation function using hybrid methods , 2013, 2013 8th International Workshop on Reconfigurable and Communication-Centric Systems-on-Chip (ReCoSoC).
[4] Nanning Zheng,et al. Design Space Exploration of Neural Network Activation Function Circuits , 2018, IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems.
[5] David A. Patterson,et al. In-datacenter performance analysis of a tensor processing unit , 2017, 2017 ACM/IEEE 44th Annual International Symposium on Computer Architecture (ISCA).
[6] Huapeng Wu,et al. High Speed VLSI Implementation of the Hyperbolic Tangent Sigmoid Function , 2008, 2008 Third International Conference on Convergence and Hybrid Information Technology.
[7] N. Burgess,et al. Some results on Taylor-series function approximation on FPGA , 2003, The Thirty-Seventh Asilomar Conference on Signals, Systems & Computers, 2003.
[8] Aman Jantan,et al. State-of-the-art in artificial neural network applications: A survey , 2018, Heliyon.
[9] L. Fanucci,et al. Low-error digital hardware implementation of artificial neuron activation functions and their derivative , 2011, Microprocessors and Microsystems.
[10] Majid Ahmadi,et al. Precise digital implementations of hyperbolic tanh and sigmoid function , 2016, 2016 50th Asilomar Conference on Signals, Systems and Computers.
[11] Hyeong-Ju Kang,et al. Short floating-point representation for convolutional neural network inference , 2019, IEICE Electron. Express.
[12] J. M. Tarela,et al. Approximation of sigmoid function and the derivative for hardware implementation of artificial neurons , 2004 .
[13] Pramod Kumar Meher. An optimized lookup-table for the evaluation of sigmoid function for artificial neural networks , 2010, 2010 18th IEEE/IFIP International Conference on VLSI and System-on-Chip.
[14] M. Masmoudi,et al. Implementation approaches of neural networks lane following system , 2012, 2012 16th IEEE Mediterranean Electrotechnical Conference.
[15] Xiaofei Wang,et al. Convergence of Edge Computing and Deep Learning: A Comprehensive Survey , 2019, IEEE Communications Surveys & Tutorials.
[16] Maurizio Valle,et al. Tunable Floating-Point for Artificial Neural Networks , 2018, 2018 25th IEEE International Conference on Electronics, Circuits and Systems (ICECS).
[17] Stamatis Vassiliadis,et al. Sigmoid Generators for Neural Computing Using Piecewise Approximations , 1996, IEEE Trans. Computers.
[18] Majid Ahmadi,et al. Efficient hardware implementation of the hyperbolic tangent sigmoid function , 2009, 2009 IEEE International Symposium on Circuits and Systems.
[19] S. Haykin. Neural Networks: A Comprehensive Foundation , 1994 .
[20] Peter Nilsson,et al. Hardware implementation of the exponential function using Taylor series , 2014, 2014 NORCHIP.
[21] Jeen-Shing Wang,et al. A digital circuit design of hyperbolic tangent sigmoid function for neural networks , 2008, 2008 IEEE International Symposium on Circuits and Systems.