Information theory as a unifying statistical approach for use in marketing research

Abstract Information theory is shown to provide a unified approach to a wide range of problems in marketing research. For instance, it yields characterizations parallel to those of the Hendry system and other entropic approaches with great economy of assumptions, and with the added flexibility that constraints can be easily identified for explicit consideration and implemented as needed. The same stochastic models supply goodness-of-fit tests and decision modelling structures, with applications that include market segmentation and brand-shifting choices. Information theoretic methods therefore provide a basic approach to these and other procedures, and they also address stochastic model selection and other probabilistic models of marketing choice. In particular, Minimum Discrimination Information (MDI) estimation, logit, Multiplicative Competitive Interaction (MCI), and other important choice models are shown in this paper to arise naturally from information theoretic formulations, with duality relations developed by Charnes and Cooper providing additional simplifications and interpretations.
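As a concrete illustration of the kind of MDI estimation the abstract refers to (a sketch under stated assumptions, not the paper's own algorithm or data): given a prior distribution q over brands and a single observed moment constraint, the MDI estimate minimizes the Kullback-Leibler divergence KL(p || q) subject to that constraint. By duality, the solution is an exponential tilting of q, p_i ∝ q_i · exp(λ x_i), and the multiplier λ can be found by one-dimensional root finding. The brand shares, attribute values, and target moment below are invented for the example.

```python
import math

def mdi_fit(q, x, target, lo=-50.0, hi=50.0, tol=1e-10):
    """Minimum Discrimination Information estimate with one moment constraint.

    Finds p minimizing KL(p || q) subject to sum_i p_i * x_i == target.
    The dual solution is p_i proportional to q_i * exp(lam * x_i); since the
    tilted moment is increasing in lam, we solve for lam by bisection.
    Requires min(x) < target < max(x).
    """
    def tilt(lam):
        # Exponentially tilted (and renormalized) version of the prior q.
        w = [qi * math.exp(lam * xi) for qi, xi in zip(q, x)]
        z = sum(w)
        return [wi / z for wi in w]

    def moment(lam):
        return sum(pi * xi for pi, xi in zip(tilt(lam), x))

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if moment(mid) < target:
            lo = mid
        else:
            hi = mid
    return tilt(0.5 * (lo + hi))

# Hypothetical prior brand shares, a brand attribute, and a target mean:
p = mdi_fit(q=[0.5, 0.3, 0.2], x=[1.0, 2.0, 3.0], target=2.0)
```

Note that with a uniform prior q the tilted form reduces to the familiar softmax/logit shape, which is one way the logit model "arises naturally" from an information theoretic formulation, as the abstract indicates.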
