Fast, low-power evaluation of elementary functions using radial basis function networks

Fast and efficient implementations of elementary functions such as sin(), cos(), and log() are of great importance in a large class of applications. State-of-the-art methods for function evaluation involve either expensive calculations such as multiplications, a large number of iterations, or large Lookup-Tables (LUTs). A higher number of iterations leads to higher latency, whereas large LUTs contribute to delay, larger area, and higher power consumption owing to data fetching and leakage. We propose a hardware architecture for evaluating mathematical functions, consisting of a small LUT and a simple Radial Basis Function Network (RBFN), a type of Artificial Neural Network (ANN). Our proposed method evaluates trigonometric, hyperbolic, exponential, logarithmic, and square-root functions. This technique is useful in applications where performance and power consumption are the highest priorities. In contrast to traditional ANNs, our approach does not involve multiplication when determining the post-synaptic states of the network. Owing to the simplicity of the approach, we attain more than 2.5x power savings and more than 1.4x performance improvement compared with traditional approaches, at the same accuracy.
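
To make the general idea concrete, the sketch below fits a small RBF network to sin(x) on [0, pi/2]: a handful of evenly spaced centers plays the role of a small table of network nodes, and the output weights are obtained by linear least squares. This is only an illustration of RBFN-based function approximation; the Gaussian kernel, the number of centers, the kernel width, and the training grid are assumptions, and the paper's multiplication-free, LUT-assisted hardware datapath is not reproduced here.

```python
# Illustrative sketch only: approximating sin(x) on [0, pi/2] with a small
# radial basis function network (Gaussian kernels, least-squares weights).
# The paper's multiplication-free hardware architecture is NOT reproduced;
# centers, kernel width, and grid sizes are assumptions for this example.
import numpy as np

def rbf_design_matrix(x, centers, width):
    """Gaussian RBF activations: phi[i, j] = exp(-((x[i] - c[j]) / width)^2)."""
    return np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)

# Training grid and target function.
x_train = np.linspace(0.0, np.pi / 2, 64)
y_train = np.sin(x_train)

# A handful of evenly spaced centers stands in for a small table of nodes.
centers = np.linspace(0.0, np.pi / 2, 8)
width = centers[1] - centers[0]

# Solve for the output weights by linear least squares.
phi = rbf_design_matrix(x_train, centers, width)
weights, *_ = np.linalg.lstsq(phi, y_train, rcond=None)

# Evaluate on unseen points and report the worst-case error.
x_test = np.linspace(0.0, np.pi / 2, 1000)
y_pred = rbf_design_matrix(x_test, centers, width) @ weights
print("max |error| =", np.max(np.abs(y_pred - np.sin(x_test))))
```

With only 8 centers this already reaches roughly table-interpolation accuracy on sin(x); a hardware realization would additionally quantize the weights and replace the weighted sum with the paper's multiplication-free update, which is not shown here.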
