Implementing the minimum-misclassification-error energy function for target recognition

The authors demonstrate through an example that the minimum-misclassification-error (MME) classifier can dramatically outperform the sigmoid-least-mean-squares (σ-LMS) classifier. Three energy functions are proposed that are useful for classification goals other than simply minimizing the misclassification rate. The first is a minimum-cost function, which allows misclassifications from different classes to incur different costs. The second is a Neyman-Pearson function, which minimizes the number of misclassifications for one class given a fixed misclassification rate for the other class. The third is a minimax function, which minimizes the maximum number of misclassifications when the a priori probabilities of the classes are unknown. Unlike their classical-classifier counterparts, these energy functions operate directly on a training set and do not require that the class probability distributions be known.
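The abstract does not give the functional form of these energy functions. One common way to build a trainable misclassification-count energy, sketched below under stated assumptions rather than as the authors' construction, is to replace the hard 0/1 error indicator with a steep sigmoid of the classifier margin; an optional per-sample cost weighting then gives a minimum-cost-style variant. The linear discriminant, the `beta` steepness parameter, and the `costs` vector are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mme_energy(w, X, t, beta=10.0, costs=None):
    """Approximate misclassification-count energy for a linear
    two-class discriminant y(x) = w . x  (illustrative sketch).

    w     : (d,) weight vector of the discriminant
    X     : (N, d) training inputs
    t     : (N,) class labels in {-1, +1}
    beta  : steepness of the sigmoid "soft step"; as beta grows,
            the energy approaches a hard count of misclassifications
    costs : optional (N,) per-sample misclassification costs
            (e.g. one value per class) for a minimum-cost variant
    """
    margins = t * (X @ w)                # positive when correctly classified
    errors = sigmoid(-beta * margins)    # ~1 for errors, ~0 for correct decisions
    if costs is not None:
        errors = costs * errors          # weight errors by class-dependent cost
    return errors.sum()
```

Because the energy is a smooth function of the weights, it can be minimized by ordinary gradient descent on the training set, with no knowledge of the class-conditional probability distributions required.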