A Note on Estimation in Straight Line Regression When Both Variables Are Subject to Error
Abstract Let $y_i = \alpha + \beta\xi_i + e_i$, $x_i = \xi_i + \delta_i$, $i = 1, 2, \cdots, N$, $N \ge 3$, where $(e_i, \delta_i)$, $i = 1, 2, \cdots, N$, are independent bivariate normal with zero means, variances $\sigma_e^2$, $\sigma_\delta^2$ and correlation $\rho$, the $\xi_i$ being unknown constants at least two of which are distinct. Let $\tau = (N-1)^{-1} \sum_{i=1}^{N} (\xi_i - \bar{\xi})^2 / \sigma_\delta^2$, and let $b$ denote the least-squares slope estimate computed from the observed pairs $(x_i, y_i)$. A recent article by Richardson and Wu gives, among other things, the expected value and mean square error of $b$ when $\rho = 0$. This note generalizes their results, using a different method, to the case $\rho \ne 0$.
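The setup in the abstract can be illustrated with a short Monte Carlo sketch: simulate the errors-in-variables model, compute the least-squares slope $b$ from the observed pairs, and compare its average to the familiar large-sample attenuation approximation $\beta\tau/(\tau+1)$. All parameter values below are illustrative assumptions, not taken from the note.

```python
import numpy as np

# Errors-in-variables model from the abstract:
#   y_i = alpha + beta*xi_i + e_i,   x_i = xi_i + delta_i,
# with (e_i, delta_i) bivariate normal, zero means, variances
# sigma_e^2 and sigma_d^2, and correlation rho.
rng = np.random.default_rng(0)

alpha, beta = 1.0, 2.0
N = 50
xi = np.linspace(0.0, 10.0, N)          # fixed unknown constants xi_i
sigma_e, sigma_d, rho = 1.0, 1.0, 0.0   # rho = 0: the Richardson-Wu case

cov = [[sigma_e**2, rho * sigma_e * sigma_d],
       [rho * sigma_e * sigma_d, sigma_d**2]]

def ls_slope(x, y):
    """Ordinary least-squares slope b = S_xy / S_xx."""
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / (xc @ xc)

reps = 20000
b = np.empty(reps)
for r in range(reps):
    e, d = rng.multivariate_normal([0.0, 0.0], cov, size=N).T
    y = alpha + beta * xi + e   # true regression on the xi_i
    x = xi + d                  # xi_i observed with error
    b[r] = ls_slope(x, y)

# Signal-to-noise ratio tau from the abstract, and the large-N
# attenuation approximation E[b] ~ beta * tau / (tau + 1).
tau = xi.var(ddof=1) / sigma_d**2
print(f"mean(b) = {b.mean():.3f}")
print(f"beta*tau/(tau+1) = {beta * tau / (tau + 1):.3f}")
```

Setting `rho` to a nonzero value shifts the bias of $b$ beyond pure attenuation, which is the case the note addresses.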
[1] D. H. Richardson, et al., "Least Squares and Grouping Method Estimators in the Errors in Variables Model," 1970.
[2] J. D. Williams, "Moments of the Ratio of the Mean Square Successive Difference to the Mean Square Difference in Samples From a Normal Universe," 1941.
[3] R. A. Silverman, et al., "Theory of Functions of a Complex Variable," 1968.
[4] A. Madansky, "The Fitting of Straight Lines When Both Variables Are Subject to Error," 1959.