SUMMARY

To define a likelihood we have to specify the form of the distribution of the observations, but to define a quasi-likelihood function we need only specify a relation between the mean and variance of the observations, and the quasi-likelihood can then be used for estimation. For a one-parameter exponential family the log likelihood is the same as the quasi-likelihood, and it follows that assuming a one-parameter exponential family is the weakest sort of distributional assumption that can be made. The Gauss-Newton method for calculating nonlinear least squares estimates generalizes easily to deal with maximum quasi-likelihood estimates, and a rearrangement of this produces a generalization of the method described by Nelder & Wedderburn (1972).

This paper is mainly concerned with fitting regression models, linear or nonlinear, in which the variance of each observation is specified to be either equal to, or proportional to, some function of its expectation. If the form of distribution of the observations were specified, the method of maximum likelihood would give estimates of the parameters in the model. For instance, if the observations are specified to have normally distributed errors with constant variance, the method of least squares provides expressions for the variances and covariances of the estimates, exact for linear models and approximate for nonlinear ones. These estimates and the expressions for their errors remain valid even if the observations are not normally distributed but merely have a fixed variance; thus, with linear models and a given error variance, the variance of least squares estimates is not affected by the distribution of the errors, and the same holds approximately for nonlinear models. A more general situation is considered in this paper, namely that in which there is a given relation between the variance and the mean of the observations, possibly with an unknown constant of proportionality.
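The generalized Gauss-Newton iteration mentioned above can be sketched numerically. The following is an illustrative implementation, not the paper's own code: the log-link model, the function names, and the simulated data are all assumptions made for the example. Each step solves the weighted normal equations with weights given by the reciprocal of the specified variance function.

```python
import numpy as np

def quasi_fit(X, y, var_fn, n_iter=25):
    """Gauss-Newton iteration for maximum quasi-likelihood estimates
    in a log-link regression, mu = exp(X @ beta), where var_fn(mu)
    is the user-specified variance function V(mu).  Sketch only."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)            # fitted means under the log link
        D = X * mu[:, None]              # derivative matrix d mu / d beta
        W = 1.0 / var_fn(mu)             # inverse-variance weights
        # Weighted normal equations: (D' W D) delta = D' W (y - mu)
        A = D.T @ (W[:, None] * D)
        b = D.T @ (W * (y - mu))
        beta = beta + np.linalg.solve(A, b)
    return beta

# Simulated check with variance equal to the mean, V(mu) = mu,
# which corresponds to Poisson-like observations.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
true_beta = np.array([0.5, 1.0])
y = rng.poisson(np.exp(X @ true_beta))
beta_hat = quasi_fit(X, y, var_fn=lambda mu: mu)
```

With constant variance, `var_fn=lambda mu: np.ones_like(mu)`, the same iteration reduces to ordinary nonlinear least squares, as the summary notes.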
A similar problem was considered from a Bayesian viewpoint by Hartigan (1969). We define a quasi-likelihood function, which can be used for estimation in the same way as a likelihood function. With constant variance this again leads to least squares estimation. When other mean-variance relationships are specified, the quasi-likelihood sometimes turns out to be a recognizable likelihood function; for instance, for a constant coefficient of variation the quasi-likelihood function is the same as the likelihood obtained by treating the observations as if they had a gamma distribution.
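The gamma correspondence can be verified directly. For a constant coefficient of variation the variance function is V(mu) = mu^2, so the quasi-likelihood satisfies dQ/dmu = (y - mu)/mu^2, integrating to Q(mu; y) = -y/mu - log(mu) up to a function of y alone; the gamma log-likelihood with shape nu has the kernel nu*(-y/mu - log mu), so the two give identical estimating equations. The function names below are illustrative, not from the paper:

```python
import numpy as np

def quasi_score(y, mu):
    # dQ/dmu = (y - mu)/V(mu) with V(mu) = mu**2 (constant CV)
    return (y - mu) / mu**2

def gamma_score(y, mu, nu=1.0):
    # d/dmu of the gamma log-likelihood kernel nu*(-y/mu - log mu)
    return nu * (y / mu**2 - 1.0 / mu)

y, mu = 3.0, 2.0
same = np.isclose(quasi_score(y, mu), gamma_score(y, mu))
# For general shape nu the gamma score is proportional to the
# quasi-score, so both vanish at the same estimate of mu.
proportional = np.isclose(gamma_score(y, mu, nu=2.0),
                          2.0 * quasi_score(y, mu))
```

Proportionality of the scores is all that matters for estimation: the roots, and hence the fitted values, coincide.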
[1] J. Tukey. One Degree of Freedom for Non-Additivity, 1949.
[2] R. A. Fisher. A biological assay of tuberculins. Biometrics, 1949.
[3] E. Lehmann. Testing Statistical Hypotheses, 1960.
[4] K. W. Finlay, et al. The analysis of adaptation in a plant-breeding programme, 1963.
[5] M. Kendall, et al. The advanced theory of statistics, 1945.
[6] J. Hartigan. Linear Bayesian Methods, 1969.
[7] A. Gould, et al. Probit Analysis, 3rd edition, 1973.