A Generalizable Formulation of Conditional Logit with Diagnostics