A soft-competitive splitting rule for adaptive tree-structured neural networks

An algorithm for generating tree-structured neural networks using a soft-competitive recursive partitioning rule is described. It is demonstrated that this algorithm grows robust, honest estimators. Preliminary results on a 10-class, 240-dimensional optical character recognition task show that the tree outperforms backpropagation, and arguments are offered for why this should be the case. The connection of the soft-competitive splitting rule to the twoing rule is described.
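The abstract names the soft-competitive partitioning rule without defining it. As a rough illustration only, a minimal sketch of one plausible form of soft-competitive splitting — a sigmoid gate assigning each sample fractional responsibilities to two children rather than a hard left/right decision, in the spirit of soft decision trees. The function name, gating form, and temperature parameter are all assumptions for illustration, not the paper's method:

```python
import numpy as np

def soft_split(X, w, b, temperature=1.0):
    """Hypothetical soft-competitive split: each sample gets a
    responsibility in (0, 1) for the left child via a sigmoid gate;
    the right child receives the complement. (Illustrative sketch,
    not the algorithm from the paper.)"""
    z = (X @ w + b) / temperature
    left = 1.0 / (1.0 + np.exp(-z))  # soft responsibility for left child
    return left, 1.0 - left

# Toy usage: 5 samples in 3 dimensions with random gate parameters.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
w = rng.normal(size=3)
left, right = soft_split(X, w, 0.0)
```

Under such a rule, every training sample contributes (with some weight) to fitting both children, which is one route to the robustness the abstract claims; a hard split would instead route each sample to exactly one child.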