Abstract. Ensemble predictions are an integral part of routine weather and climate prediction because such projections are sensitive to the specification of the initial state. In many discussions it is tacitly assumed that ensembles are equivalent to probability distribution functions (p.d.f.s) of the random variables of interest. In general, for vector-valued random variables, this is not the case (not even approximately), since practical ensembles do not adequately sample the high-dimensional state spaces of dynamical systems of practical relevance. In this contribution we place these ideas on a rigorous footing using concepts derived from Bayesian analysis and information theory. In particular, we show that ensembles necessarily imply a coarse graining of state space, and that this coarse graining entails a loss of information relative to the converged p.d.f. To cope with the required coarse graining in practical applications, we introduce a hierarchy of entropic functionals that measure the information content of multivariate marginal distributions of increasing order. For fully converged distributions (i.e. p.d.f.s) these functionals form a strictly ordered hierarchy. With ensembles, however, increasingly coarse partitions are required as one proceeds up the hierarchy, so the strict ordering of the p.d.f.-based functionals breaks down. This breakdown is symptomatic of the necessarily limited sampling of high-dimensional state spaces by practical ensembles and is unavoidable in most practical applications.
In the second part of the paper, the theoretical machinery developed above is applied to the practical problem of mid-latitude weather prediction. We show that the functionals derived in the first part all decline essentially linearly with time, and that there appears to be a fairly well defined cut-off time (roughly 45 days for the model analyzed) beyond which initial-condition information is unimportant to statistical prediction.
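The link between finite ensemble size and forced coarse graining can be illustrated with a minimal sketch (the function name, bin range, and ensemble size below are illustrative assumptions, not taken from the paper): the discrete entropy estimated from a finite ensemble saturates near the logarithm of the ensemble size as the partition is refined, so a small ensemble can only support a coarse partition of state space.

```python
import numpy as np

rng = np.random.default_rng(0)

def coarse_grained_entropy(samples, n_bins):
    """Shannon entropy (nats) of an ensemble after partitioning each
    coordinate into n_bins equal bins over [-4, 4]."""
    edges = np.linspace(-4.0, 4.0, n_bins + 1)
    # Histogram the ensemble on the product partition of state space.
    hist, _ = np.histogramdd(samples, bins=[edges] * samples.shape[1])
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# A modest ensemble in a 2-dimensional state space.
ensemble = rng.standard_normal((100, 2))

# Refining the partition drives the estimate toward log(ensemble size):
# once most occupied cells hold a single member, no further information
# about the underlying p.d.f. is resolved.
for n_bins in (2, 8, 64):
    print(n_bins, round(coarse_grained_entropy(ensemble, n_bins), 3))
```

The empirical entropy is bounded above by log(N) for an N-member ensemble, which is why the functionals higher in the hierarchy (involving higher-order marginals, hence more cells) demand ever coarser partitions.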