A Generalizable Formulation of Conditional Logit with Diagnostics

Abstract: The conditional logit model is a multinomial logit model that permits the inclusion of choice-specific attributes. This article shows that the conditional logit model maximizes entropy subject to a set of attribute-value preserving constraints. A correspondence between the maximum entropy (ME) and maximum likelihood (ML) estimates of the logit probabilities is established. Easily computable and useful diagnostics for logit analysis are provided, and it is shown that the relative importance of attributes can be evaluated using the ME formulation. The ME formulation is also generalized to incorporate initial choice probabilities into the logit model. An example is given.

Key words: Choice models; Entropy; Kullback-Leibler discrimination information function; Relative importance.
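The ME/ML correspondence mentioned in the abstract can be illustrated numerically: at the ML solution of a conditional logit, the fitted probabilities reproduce the observed attribute totals, which is exactly the attribute-value preserving constraint of the ME program. The sketch below is not the article's code; it is a minimal illustration with simulated data, and all names (choice_probs, neg_loglik, the sample sizes) are illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the article's code): fit a conditional
# logit by ML and check that the fitted probabilities preserve the observed
# attribute totals, i.e. the ME constraints hold at the ML optimum.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_obs, n_alt, n_attr = 500, 3, 2             # decision makers, alternatives, attributes
X = rng.normal(size=(n_obs, n_alt, n_attr))  # choice-specific attributes x_ij
beta_true = np.array([1.0, -0.5])

def choice_probs(beta, X):
    """Conditional logit probabilities p_ij = exp(x_ij'b) / sum_k exp(x_ik'b)."""
    u = X @ beta                             # systematic utilities, shape (n_obs, n_alt)
    u -= u.max(axis=1, keepdims=True)        # stabilize the exponentials
    e = np.exp(u)
    return e / e.sum(axis=1, keepdims=True)

# Simulate observed choices y_ij (one-hot per decision maker).
p_true = choice_probs(beta_true, X)
choices = np.array([rng.choice(n_alt, p=p) for p in p_true])
Y = np.eye(n_alt)[choices]

def neg_loglik(beta):
    p = choice_probs(beta, X)
    return -np.sum(Y * np.log(p))

res = minimize(neg_loglik, x0=np.zeros(n_attr), method="BFGS")
beta_hat = res.x
p_hat = choice_probs(beta_hat, X)

# ML first-order conditions = ME attribute-value preserving constraints:
# sum_ij y_ij x_ij equals sum_ij p_ij x_ij at the optimum (up to solver tolerance).
observed_totals = np.einsum("ij,ijk->k", Y, X)
fitted_totals = np.einsum("ij,ijk->k", p_hat, X)
print("beta_hat:", beta_hat)
print("observed attribute totals:", observed_totals)
print("fitted attribute totals:  ", fitted_totals)
```

Running the sketch, the observed and fitted attribute totals agree to solver tolerance, which is the numerical counterpart of the ME/ML correspondence the article establishes.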
