During the past several years, the development and application of data assimilation methods have expanded rapidly. The first data assimilation applications were in meteorology, where the technique is now an essential component of numerical weather forecasting. The data revolution in oceanography is bringing the daily practice of physical oceanography closer to that of dynamic meteorology. Large observational data sets of temperature and salinity in the Atlantic are now available from global projects such as WOCE, SECTIONS, TOGA, COARE, PIRATA, and others, and improved assimilation techniques are needed for ocean models to fully exploit these new observations. Data assimilation improves the estimate of the ocean and atmospheric physical state by extracting as much information as possible from both the measurements and the dynamic model and combining them in an optimal way. Assimilation may be used to improve initial and/or boundary conditions and to estimate poorly known model parameters. Data assimilation in ocean models has been discussed in the scientific literature for approximately 30 years (for a review, see [8]).

It is necessary to understand which processes can be realistically described by ocean models and what role the measurements play. Two extreme opinions exist. The first holds that the ocean is a quasi-stationary medium, so that a few measurements suffice for a good description of the ocean state; from this point of view, numerical models are useless (or simply wrong), and data assimilation reduces to a statistical correction procedure. The second opinion is that the ocean is a highly turbulent fluid with no memory of its previous states; information provided by measurements then becomes meaningless after a short time, and the ocean must be monitored continuously. In that case, only well-designed, adequately resolved models could describe the real processes in the ocean, and data assimilation would serve only to obtain correct initial (and possibly boundary) conditions. Reality, however, lies somewhere between these two extremes, and the scientific question is how to represent this situation numerically. The answer depends on the model and the data assimilation technique, as well as on the local dynamics.

Two general concepts have been discussed for data assimilation. The first is the “variational/adjoint” method, which has been the most popular scheme (see, e.g., [22, 24, 28]). In a typical formulation, the model initial (and/or boundary) conditions are treated as unknown parameters and observations are distributed over some time interval; the optimal initial conditions are sought by comparing the model trajectory with the measurements with respect to some criterion, which leads to a constrained minimization problem. The present paper does not deal with this class of methods, so its details are not discussed here. The other class of methods is “sequential data assimilation.” Starting from some initial condition, the model solution is updated whenever measurements become available, and under certain conditions it approaches the observed state. This group of methods requires an updating scheme that combines the model solution and the measurements to produce the “best” state estimate, as sketched below.
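To fix ideas, the two approaches can be written in a generic discrete-time notation (an illustrative sketch only; the operators $M_k$, $H_k$ and the covariances $R_k$ below are generic and are not the specific formulation developed in this paper). The variational/adjoint approach seeks the initial state $x_0$ that minimizes a misfit functional subject to the model dynamics,
$$
J(x_0)=\sum_{k=0}^{N}\bigl(H_k x_k - y_k\bigr)^{\mathrm{T}} R_k^{-1}\bigl(H_k x_k - y_k\bigr),
\qquad x_{k+1}=M_k(x_k),
$$
where $y_k$ are the observations, $H_k$ maps the model state to the observed quantities, and $R_k$ is the observation-error covariance. A sequential scheme instead updates the forecast state $x_k^{f}$ whenever observations become available,
$$
x_k^{a}=x_k^{f}+K_k\bigl(y_k-H_k x_k^{f}\bigr),
$$
where the gain matrix $K_k$ weights the model forecast and the measurements according to their assumed error statistics.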
The Kalman-filter method belongs to this class of data assimilation techniques. The Kalman filter is derived in a number of books on control theory (e.g., [1, 11]); in oceanography, it has been applied in [7, 16]. The main idea of the method is to integrate the dynamic equations together with an equation for the error covariance matrix, where the error is the difference between the model and observed values. Such applications are presented in [9, 16] and are now in common use. In this paper, another method for defining the error covariance matrix is presented; it continues the studies published in our earlier paper by K. Belyaev, S. Meyers, and J. J. O’Brien.
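For reference, the standard discrete Kalman-filter recursion from the control-theory literature (e.g., [1, 11]) alternates a forecast step, which propagates the state and its error covariance, with an analysis step that blends the forecast with the observations; the model-error and observation-error covariances $Q_k$ and $R_k$ are written here in the same generic notation as above, not in the specific form used later in this paper:
$$
x_k^{f}=M_k x_{k-1}^{a},\qquad
P_k^{f}=M_k P_{k-1}^{a} M_k^{\mathrm{T}}+Q_k,
$$
$$
K_k=P_k^{f}H_k^{\mathrm{T}}\bigl(H_k P_k^{f}H_k^{\mathrm{T}}+R_k\bigr)^{-1},\qquad
x_k^{a}=x_k^{f}+K_k\bigl(y_k-H_k x_k^{f}\bigr),\qquad
P_k^{a}=\bigl(I-K_k H_k\bigr)P_k^{f},
$$
where $P_k^{f}$ and $P_k^{a}$ denote the forecast and analysis error covariance matrices.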
[1] W. C. Thacker et al., Fitting models to inadequate data by enforcing spatial and temporal smoothness, 1988.
[2] D. Müller et al., Bispectra of Sea-Surface Temperature Anomalies, 1987.
[3] A. Jazwinski, Stochastic Processes and Filtering Theory, 1970.
[4] K. Bryan, A Numerical Method for the Study of the Circulation of the World Ocean, 1997.
[5] J. Doob, Stochastic Processes, 1953.
[6] S. Levitus, Climatological Atlas of the World Ocean, 1982.
[7] G. Evensen et al., Advanced Data Assimilation for Strongly Nonlinear Dynamics, 1997.
[8] W. R. Holland et al., Data Constraints Applied to Models of the Ocean General Circulation. Part I: The Steady Case, 1986.
[9] G. Evensen, Using the Extended Kalman Filter with a Multilayer Quasi-Geostrophic Ocean Model, 1992.
[10] J. Derber et al., Variational Data Assimilation with an Adiabatic Version of the NMC Spectral Model, 1992.
[11] J. M. Oberhuber et al., An Atlas Based on the COADS Data Set: The Budgets of Heat, Buoyancy and Turbulent Kinetic Energy at t, 1988.
[12] J. O'Brien et al., Variational data assimilation for determining the seasonal net surface heat flux using a tropical Pacific ocean model, 1995.
[13] R. N. Miller et al., A Kalman Filter Analysis of Sea Level Height in the Tropical Pacific, 1989.
[14] M. Ghil et al., Data assimilation in meteorology and oceanography, 1991.
[15] M. Ghil et al., Meteorological data assimilation for oceanographers. Part I: Description and theoretical framework, 1989.
[16] H. H. Rachford et al., The Numerical Solution of Parabolic and Elliptic Differential Equations, 1955.