An explanation of generalized profile likelihoods
Let X, T, Y be random vectors such that the conditional distribution of Y given covariates partitioned into the vectors X = x and T = t is given by f(y; x, φ), where φ = (θ, η(t)). Here θ is a parameter vector and η(t) is a smooth, real-valued function of t. The joint distribution of X and T is assumed to be independent of θ and η. This semiparametric model is called conditionally parametric because the conditional distribution f(y; x, φ) of Y given X = x, T = t is parameterized by a finite-dimensional parameter φ = (θ, η(t)). Severini and Wong (1992. Annals of Statistics 20: 1768–1802) show how to estimate θ and η(·) using generalized profile likelihoods, and they also provide a review of the literature on generalized profile likelihoods. Under specified regularity conditions, they derive an asymptotically efficient estimator of θ and a uniformly consistent estimator of η(·). The purpose of this paper is to provide a short tutorial for this method of estimation under a likelihood-based model, reviewing results from Stein (1956. Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, vol. 1, University of California Press, Berkeley, pp. 187–196), Severini (1987. Ph.D. Thesis, The University of Chicago, Department of Statistics, Chicago, Illinois), and Severini and Wong (op. cit.).
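To make the estimation scheme concrete, the following is a minimal sketch for the Gaussian special case in which Y given X = x, T = t has mean xᵀθ + η(t) and constant variance (the partially linear model studied by Speckman 1988). For each candidate θ, η(·) is estimated by a kernel smooth of the residuals y − xᵀθ (in the spirit of Staniswalis 1989), and the resulting profile likelihood is then maximized over θ. The Gaussian kernel, the fixed bandwidth, and the simulated data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

def kernel_weights(t_eval, t_obs, bandwidth):
    # Row-normalized Gaussian kernel weights over the smoothing covariate t
    # (an assumed smoother for illustration).
    d = (t_eval[:, None] - t_obs[None, :]) / bandwidth
    w = np.exp(-0.5 * d ** 2)
    return w / w.sum(axis=1, keepdims=True)

def eta_hat(theta, y, x, t, bandwidth):
    # For fixed theta, estimate eta(t_i) by kernel-weighted likelihood; with
    # Gaussian errors this is a Nadaraya-Watson smooth of the residuals y - x'theta.
    w = kernel_weights(t, t, bandwidth)
    return w @ (y - x @ theta)

def neg_profile_loglik(theta, y, x, t, bandwidth):
    # Generalized profile log-likelihood: substitute eta_hat(theta, .) back into
    # the Gaussian log-likelihood and treat the result as a function of theta alone.
    mu = x @ theta + eta_hat(theta, y, x, t, bandwidth)
    return 0.5 * np.sum((y - mu) ** 2)

# Simulated data from the assumed model y = x'theta + eta(t) + noise.
rng = np.random.default_rng(0)
n = 200
t = np.sort(rng.uniform(0.0, 1.0, n))
x = rng.normal(size=(n, 2))
theta_true = np.array([1.0, -0.5])
y = x @ theta_true + np.sin(2 * np.pi * t) + 0.3 * rng.normal(size=n)

# Maximize the profile likelihood over theta, then recover eta(.) at the fitted theta.
fit = minimize(neg_profile_loglik, x0=np.zeros(2),
               args=(y, x, t, 0.1), method="BFGS")
theta_est = fit.x
eta_est = eta_hat(theta_est, y, x, t, 0.1)
print("theta_hat:", theta_est)
```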
[1] T. Severini and W. Wong. Profile Likelihood and Conditionally Parametric Models. Annals of Statistics 20: 1768–1802, 1992.
[2] J. Staniswalis. The Kernel Estimate of a Regression Function in Likelihood-Based Models, 1989.
[3] C. Stein. Efficient Nonparametric Testing and Estimation. Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, vol. 1, University of California Press, Berkeley, pp. 187–196, 1956.
[4] P. Speckman. Kernel Smoothing in Partial Linear Models, 1988.