End of Discriminant Functions based on Variance-covariance Matrices

Fisher proposed his linear discriminant function (LDF) by maximizing the variance ratio. When the data satisfy Fisher's assumption, the same LDF is easily derived from the variance-covariance matrices. When the variance-covariance matrices of the two classes are not equal, the quadratic discriminant function (QDF) is used instead. These discriminant functions have three problems. First, all discriminant functions except Revised IP-OLDF cannot correctly classify cases xi lying on the discriminant hyperplane (the unresolved problem of discriminant analysis); only Revised IP-OLDF solves this problem theoretically. Second, LDF and QDF fail to recognize linearly separable data in many cases, and their numbers of misclassifications (NMs) are usually higher than those of logistic regression, which many users rely on. Third, LDF and QDF cannot be obtained when the values of some independent variable are constant, because the inverse matrix cannot be computed; JMP provides a regularized discriminant function for such difficult data when QDF fails. These facts mean that LDF and QDF should not be used for important discrimination tasks. By contrast, Revised IP-OLDF, based on the Minimum Number of Misclassifications (MNM) criterion, resolves these problems. In addition, in 100-fold cross validation the mean error rates of Revised IP-OLDF are lower than those of LDF, logistic regression, and soft-margin SVM (S-SVM) in many cases.
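The LDF derived from the variance-covariance matrices, and its failure mode when a variable is constant, can be sketched as follows. This is a minimal illustration on hypothetical two-class data, not the paper's Revised IP-OLDF: the coefficients are w = S_pooled^{-1}(m1 - m2) with a midpoint cutoff, and appending a constant column makes the pooled matrix singular, so the inverse cannot be computed.

```python
import numpy as np

def fisher_ldf(X1, X2):
    """Fisher's LDF under the equal-covariance assumption."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    n1, n2 = len(X1), len(X2)
    # Pooled within-class variance-covariance matrix
    S1 = np.cov(X1, rowvar=False)
    S2 = np.cov(X2, rowvar=False)
    Sp = ((n1 - 1) * S1 + (n2 - 1) * S2) / (n1 + n2 - 2)
    # Raises LinAlgError if Sp is singular (e.g. a constant variable)
    w = np.linalg.solve(Sp, m1 - m2)
    c = w @ (m1 + m2) / 2  # midpoint cutoff between class means
    return w, c

rng = np.random.default_rng(0)
X1 = rng.normal([0.0, 0.0], 1.0, size=(50, 2))
X2 = rng.normal([3.0, 3.0], 1.0, size=(50, 2))
w, c = fisher_ldf(X1, X2)
# Classify: w @ x - c > 0 assigns x to class 1
acc1 = np.mean(X1 @ w - c > 0)
acc2 = np.mean(X2 @ w - c < 0)

# Third problem from the text: a constant independent variable
# makes the pooled matrix singular, so the LDF is not obtainable.
X1c = np.hstack([X1, np.ones((50, 1))])
X2c = np.hstack([X2, np.ones((50, 1))])
try:
    fisher_ldf(X1c, X2c)
    singular_detected = False
except np.linalg.LinAlgError:
    singular_detected = True
```

Revised IP-OLDF instead minimizes the NM directly via integer programming, which avoids the matrix inversion entirely and handles cases on the discriminant hyperplane.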
