Special function neural network (SFNN) models
Robust implementations of special functions have been a concern in many scientific areas, from electromagnetics to statistics. For example, the kernel of the Helmholtz equation in a boundary integral formulation is based on Hankel functions, and the Matérn covariance in statistics depends on the Gamma and modified Bessel functions. Traditionally, these special functions are implemented using known asymptotic expansions on certain critical intervals. The strategy we introduce here is to replace asymptotic expansions with neural network (NN) models, taking advantage of the fact that NNs are provably universal approximators. This approach enables a range of operations that were previously inaccessible. For instance, high-order derivatives of a neural network model preserve the accuracy of the trained model and, as such, can be more reliable than derivatives of asymptotic expansions. Implementations of series expansions may be computationally prohibitive and prone to numerical errors in regions where they do not converge sufficiently fast. In the current work, we develop neural network models that stand in for special functions, focusing on the Bessel functions of the first and second kind and their derivatives. A special function may require different series expansions for different ranges of its argument; we showcase a strategy for using the same neural network model over any interval within the domain of definition of the function, where a traditional implementation would otherwise require several different asymptotic expansion representations.
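
To illustrate the idea, the following is a minimal sketch, not the paper's implementation, of the surrogate approach: a small multilayer perceptron is fit to reference values of J_0 on an interval, and higher-order derivatives are then obtained from the trained surrogate by automatic differentiation rather than by differentiating an asymptotic expansion. The architecture, optimizer, interval, and training length below are all illustrative assumptions.

```python
import jax
import jax.numpy as jnp
from jax import grad, vmap
import numpy as np
from scipy.special import jv  # reference Bessel J_nu values for training targets

# Illustrative MLP surrogate for J_0 on [0, 20]; layer sizes are assumptions,
# not the paper's architecture.
def init_params(key, sizes=(1, 32, 32, 1)):
    keys = jax.random.split(key, len(sizes) - 1)
    return [(jax.random.normal(k, (m, n)) * jnp.sqrt(2.0 / m), jnp.zeros(n))
            for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

def mlp(params, x):
    h = jnp.atleast_1d(x)
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W + b)
    W, b = params[-1]
    return (h @ W + b)[0]  # scalar output

def loss(params, xs, ys):
    preds = vmap(lambda x: mlp(params, x))(xs)
    return jnp.mean((preds - ys) ** 2)

key = jax.random.PRNGKey(0)
params = init_params(key)
xs = jnp.linspace(0.0, 20.0, 2048).reshape(-1, 1)
ys = jnp.asarray(jv(0, np.asarray(xs)).ravel())  # training targets from SciPy

# Plain gradient descent, kept deliberately simple for the sketch.
lr = 1e-3
step = jax.jit(lambda p: jax.tree_util.tree_map(
    lambda w, g: w - lr * g, p, grad(loss)(p, xs, ys)))
for _ in range(5000):
    params = step(params)

# Derivatives of any order of the surrogate come from autodiff on the trained
# model, e.g. the second derivative of J_0 at x = 5:
model = lambda x: mlp(params, jnp.array([x]))
d2_model = grad(grad(model))
print(model(5.0), d2_model(5.0))
```

The point of the sketch is the last few lines: the same trained surrogate covers the whole training interval and can be differentiated repeatedly without switching between the expansion regimes a piecewise asymptotic implementation would require.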