Neuro-Fuzzy Classification and Regression Trees

The terminal nodes of a binary tree classifier represent the discrete classes to be recognized. In this paper the classes are treated as fuzzy sets, so that a given sample can belong to more than one class with different degrees of membership. The terminal nodes in this case contain information about the degrees to which test samples belong to particular classes. This allows the development of a regression tree in which a continuous output value, such as the control signal of a fuzzy controller, can be learned. In addition to the classes being fuzzy sets, each node of the regression tree is made fuzzy by associating membership functions with the fuzzy sets feature_value < threshold and feature_value >= threshold. The output value is found by dropping the input measurement vector through the tree, where it will, in general, take both paths at each node with a weighting factor determined by the node membership functions. The crisp output value (defuzzification) is a weighted sum of the class values associated with the terminal nodes. The splitting criterion for each tree node is based on a fuzzy cumulative distribution function, a generalization of the Kolmogorov-Smirnov (K-S) distance suitable for multiple classes. The splitting of nodes terminates when every training sample belonging to a given node has its maximum degree of membership associated with the same class. Large decision trees are typically pruned to improve classification accuracy on test data. A stock market prediction example shows that growing a large fuzzy tree is an attractive alternative to pruning. Fuzzy classification and regression trees can be viewed as a fuzzy neural network in which the structure of the network, rather than the weights, is learned.
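The fuzzy evaluation described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the tree representation (nested dicts), the sigmoidal node membership function, and the steepness parameter `beta` are assumptions chosen for clarity. A sample takes both branches at every node, weighted by the node membership functions, and the crisp output is the resulting weighted sum of terminal-node values.

```python
import math

def sigmoid(x, threshold, beta):
    # Degree of membership in the fuzzy set feature_value >= threshold.
    # (The sigmoid and its steepness beta are illustrative assumptions.)
    return 1.0 / (1.0 + math.exp(-beta * (x - threshold)))

def evaluate(node, x):
    """Drop sample x through the fuzzy tree; return the defuzzified output."""
    if 'value' in node:          # terminal node: class value
        return node['value']
    mu_right = sigmoid(x[node['feature']], node['threshold'], node['beta'])
    mu_left = 1.0 - mu_right
    # The sample takes both paths; the crisp output is the membership-weighted
    # sum of the class values reached at the terminal nodes.
    return (mu_left * evaluate(node['left'], x)
            + mu_right * evaluate(node['right'], x))

# Example: a single fuzzy split on feature 0 at threshold 0.0.
tree = {'feature': 0, 'threshold': 0.0, 'beta': 10.0,
        'left': {'value': 0.0}, 'right': {'value': 1.0}}
```

At the threshold itself both branches receive weight 0.5, so `evaluate(tree, [0.0])` returns the average of the two leaf values; far from the threshold the output approaches the value of a single leaf, recovering ordinary crisp tree behavior as `beta` grows.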
Such neuro-fuzzy classification and regression trees should lend themselves to efficient implementation in a VLSI chip in which each test sample can propagate through all paths simultaneously.
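The splitting criterion mentioned above, a multi-class generalization of the K-S distance built from fuzzy cumulative distributions, can be sketched as follows. This is an assumed reading of the criterion, not the paper's exact formulation: per-class empirical CDFs are accumulated with each sample's membership degrees as weights, and the threshold maximizing the largest pairwise CDF gap is selected.

```python
def best_split(values, memberships):
    """Choose the threshold on one feature that maximizes the fuzzy K-S
    distance between per-class cumulative distributions.

    values: feature value of each training sample
    memberships: per-sample list of class membership degrees
    """
    n_classes = len(memberships[0])
    order = sorted(range(len(values)), key=lambda i: values[i])
    totals = [sum(m[c] for m in memberships) for c in range(n_classes)]
    best_t, best_d = None, -1.0
    cum = [0.0] * n_classes
    for idx, i in enumerate(order[:-1]):
        for c in range(n_classes):
            cum[c] += memberships[i][c]
        # Fuzzy empirical CDF of each class at this candidate threshold.
        F = [cum[c] / totals[c] for c in range(n_classes)]
        # K-S style statistic: the largest gap between any pair of class CDFs.
        d = max(abs(F[a] - F[b])
                for a in range(n_classes) for b in range(a + 1, n_classes))
        if d > best_d:
            best_d = d
            best_t = 0.5 * (values[i] + values[order[idx + 1]])
    return best_t, best_d
```

For crisp (0/1) memberships this reduces to the ordinary two-sample K-S distance between class distributions; fractional memberships simply contribute fractional mass to each class CDF.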