Information processing and Bayesian analysis

Abstract: Science involves learning from data. Here, this process of learning, or information processing, is considered within the context of optimal information processing, as in Zellner (1988, 1991, 1997). Information criterion functionals are formulated and optimized to provide optimal information processing rules, one of which is Bayes' theorem. By varying the inputs and using alternative side conditions, various optimal information processing rules are derived and evaluated. Generally, output information equals input information for these rules, and thus they are 100% efficient learning rules. When different weights or costs are associated with alternative inputs, "anchoring"-like effects, much emphasized in the psychological literature, result from optimal information processing procedures. Further, dynamic information processing results are reviewed and extensions noted. Last, some implications of the information processing approach for learning from data are discussed.
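The claim that Bayes' theorem is a 100% efficient rule rests on the identity ln g(θ|y) + ln h(y) = ln π(θ) + ln f(y|θ), which holds pointwise for the posterior g, prior π, likelihood f, and marginal h. A minimal numerical sketch can illustrate this; the two-state parameter, the probabilities, and all variable names below are illustrative assumptions, not taken from the paper:

```python
import math

# Toy setup: a two-state parameter theta in {"A", "B"} and one observed y.
prior = {"A": 0.3, "B": 0.7}   # pi(theta)
lik   = {"A": 0.8, "B": 0.4}   # f(y | theta) for the observed y

# Marginal likelihood h(y) and posterior g(theta | y) via Bayes' theorem.
h = sum(prior[t] * lik[t] for t in prior)
post = {t: prior[t] * lik[t] / h for t in prior}

# Output information minus input information, averaged over the posterior:
# delta = E_g[ln g + ln h - ln pi - ln f].
# For Bayes' rule this is zero, since the bracketed term vanishes pointwise.
delta = sum(post[t] * (math.log(post[t]) + math.log(h)
                       - math.log(prior[t]) - math.log(lik[t]))
            for t in prior)
print(delta)
```

Running this prints a value that is zero up to floating-point error, illustrating (in this discrete toy case) the "output information = input information" efficiency property the abstract attributes to Bayes' theorem.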

[1] N. M. van Dijk, et al. To Pool or Not to Pool, 2000.

[2] R. Hogarth, et al. Order effects in belief updating: The belief-adjustment model, 1992, Cognitive Psychology.

[3] Arnold Zellner, et al. Further Results on Bayesian Method of Moments Analysis of the Multiple Regression Model, 2001.

[4] L. M. M.-T. Theory of Probability, 1929, Nature.

[5] Thomas B. Fomby, et al. Applying maximum entropy to econometric problems, 1997.

[6] Arnold Zellner. Bayesian Method of Moments (BMOM) Analysis of Mean and Regression Models, 1996.

[7] Arnold Zellner, et al. Bayesian Methods and Entropy in Economics and Econometrics, 1991.

[8] K. Chaloner, et al. Bayesian analysis in statistics and econometrics: essays in honor of Arnold Zellner, 1996.

[9] E. T. Jaynes, et al. [Optimal Information Processing and Bayes's Theorem]: Comment, 1988.

[10] Ehsan S. Soofi, et al. Information Theory and Bayesian Statistics, 1996.

[11] Bruce M. Hill. [Optimal Information Processing and Bayes's Theorem]: Comment, 1988.

[12] Arnold Zellner, et al. Bayesian regression diagnostics with applications to international consumption and income data, 1985.

[13] Daniel Gianola, et al. An Introduction to Bayesian Inference, 2002.

[14] Haji Y. Izan. To pool or not to pool?: A reexamination of Tobin's food demand problem, 1980.

[15] Inferring the Nutrient Content of Food With Prior Information, 1999.

[16] John Skilling, et al. Maximum Entropy and Bayesian Methods, 1989.

[17] Douglas J. Miller, et al. Maximum entropy econometrics: robust estimation with limited data, 1996.

[18] Arnold Zellner, et al. Bayesian Analysis in Econometrics and Statistics: The Zellner View and Papers, 1997.

[19] Arnold Zellner, et al. The Bayesian Method of Moments (BMOM), 1997.

[20] James Durbin, et al. Errors in variables, 1954.

[21] A. Zellner. Optimal Information Processing and Bayes's Theorem, 1988.

[22] J. Jaffray, et al. Rational Behavior under Complete Ignorance, 1980.

[23] A. Zellner. An Introduction to Bayesian Inference in Econometrics, 1971.

[24] E. Soofi. Information Theoretic Regression Methods, 1997.

[25] William E. Strawderman, et al. A Bayesian growth and yield model for slash pine plantations, 1996.

[26] Kenneth J. Arrow, et al. Studies in Resource Allocation Processes: Appendix: An optimality criterion for decision-making under ignorance, 1977.