Method of computing gradient vector and Jacobian matrix in arbitrarily connected neural networks

The paper shows that if fully connected neural networks are used, the same problems can be solved with fewer neurons and weights. Interestingly, such networks are also trained faster. The difficulty is that most neural network training algorithms are not suitable for such networks. The presented algorithm and software allow training feedforward neural networks with arbitrarily connected neurons, in a similar way as the SPICE program can analyze any circuit topology. When a second-order algorithm is used (for which the Jacobian must be calculated), the solution is obtained about 100 times faster.
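To make the idea concrete, the following is a minimal sketch (not the paper's implementation) of computing one Jacobian row for an arbitrarily connected feedforward network. The network representation, the `tanh` activation, and the function names are assumptions for illustration: neurons are listed in topological order, and each neuron may connect to any network input or any earlier neuron, including cross-layer connections that a layered MLP cannot express. Derivatives of the output with respect to every weight and bias are obtained by backpropagating through the nodes in reverse order.

```python
import math

def forward(net_def, x):
    """Forward pass through an arbitrarily connected feedforward net.
    net_def: list of neurons; each neuron is (bias, [(src_node, weight), ...]).
    Node indices 0..len(x)-1 are the inputs; each neuron then gets the
    next index. Returns all node values and each neuron's net input."""
    v = list(x)
    nets = []
    for bias, conns in net_def:
        net = bias + sum(w * v[s] for s, w in conns)
        nets.append(net)
        v.append(math.tanh(net))
    return v, nets

def jacobian_row(net_def, x):
    """One Jacobian row for one training pattern: derivatives of the last
    neuron's output with respect to every bias and weight, computed by
    backpropagation in reverse topological order (works for any DAG
    topology, not just layered networks)."""
    n_in = len(x)
    v, nets = forward(net_def, x)
    delta = [0.0] * len(v)        # delta[node] = d(output)/d(node value)
    delta[-1] = 1.0               # seed at the output node
    row = {}                      # (neuron_idx, 'bias' or src_node) -> derivative
    for k in range(len(net_def) - 1, -1, -1):
        node = n_in + k
        bias, conns = net_def[k]
        dnet = delta[node] * (1.0 - math.tanh(nets[k]) ** 2)  # through tanh'
        row[(k, 'bias')] = dnet
        for s, w in conns:
            row[(k, s)] = dnet * v[s]   # derivative w.r.t. this weight
            delta[s] += dnet * w        # propagate to the source node
    return row
```

The gradient vector for first-order training follows by multiplying each row by the corresponding output error and summing over patterns; stacking the rows over all patterns gives the Jacobian needed by second-order (Levenberg-Marquardt-type) algorithms.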