On an error backpropagation algorithm using the absolute error function

We propose an error backpropagation algorithm that uses the absolute error function as its objective function. Error backpropagation is the most popular learning algorithm for multi-layered neural networks, and it usually minimizes the squared error function. The squared error, however, has the drawback of becoming enormously large when the data set contains even a few anomalous data, such as observational errors. The absolute error function is far less affected by such data. However, since the absolute value function is not differentiable, the standard backpropagation procedure cannot be applied directly with the absolute error as the objective. We therefore first introduce differentiable approximations of the absolute value function; the purpose of such an approximation is to construct a differentiable error function that is close to the absolute error function. We then propose an error backpropagation algorithm that minimizes this differentiable approximate error function. Computational experiments indicate that the proposed method is practically efficient. In particular, it is more robust and learns faster than backpropagation with the squared error function when the teacher signals include some incorrect data.
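To illustrate the idea, the following minimal sketch trains a small two-layer network by backpropagation on a smoothed absolute error. The specific approximation sqrt(e^2 + eps^2), the value of eps, the network sizes, and the synthetic data with a few anomalous teacher signals are all assumptions made for this example; the abstract does not specify which approximate function the paper actually uses.

```python
import numpy as np

def smooth_abs(e, eps=1e-2):
    # Differentiable approximation of |e| (assumed form); approaches |e| as eps -> 0.
    return np.sqrt(e**2 + eps**2)

def smooth_abs_grad(e, eps=1e-2):
    # Derivative of the approximation; bounded in (-1, 1), similar to sign(e).
    return e / np.sqrt(e**2 + eps**2)

# Synthetic regression data with a few anomalous teacher signals (assumed setup).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
t = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)
t[::20] += 10.0  # corrupt a few targets to mimic incorrect teacher data

# Small two-layer network: tanh hidden layer, linear output.
W1 = rng.normal(scale=0.1, size=(3, 8))
W2 = rng.normal(scale=0.1, size=(8, 1))
lr = 0.01

for epoch in range(200):
    h = np.tanh(X @ W1)          # hidden activations
    y = (h @ W2).ravel()         # network outputs
    e = y - t                    # residuals
    # Gradient of sum_i smooth_abs(e_i) with respect to the outputs.
    g = smooth_abs_grad(e)[:, None]
    # Backpropagate through the two layers.
    grad_W2 = h.T @ g
    grad_W1 = X.T @ ((g @ W2.T) * (1 - h**2))
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1
```

Because the gradient of the smoothed absolute error is bounded, anomalous residuals contribute a weight update of at most fixed magnitude, whereas with the squared error the update grows linearly with the residual, which is the robustness property the abstract points to.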