Kshirsagar-Type Lower Bounds for Mean Squared Error of Prediction

Let Y be an observable random vector and Z an unobserved random variable with joint density f(y, z | θ), where θ is an unknown parameter vector. For the problem of predicting Z based on Y, we derive Kshirsagar-type lower bounds for the mean squared error of any predictor of Z. These bounds do not require the regularity conditions of the Bhattacharyya bounds and are therefore more widely applicable. Moreover, the new bounds are shown to be sharper than the corresponding Bhattacharyya bounds. The conditions under which the new lower bounds are attained allow easy derivation of best unbiased predictors, which we illustrate with some examples.
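As a hedged illustration of the setting (the symbols δ and ψ below are illustrative and not taken from the paper), the mean squared error of prediction of a predictor δ(Y) and the covariance-inequality mechanism behind bounds of this type can be sketched as:

```latex
% Mean squared error of prediction of a predictor \delta(Y) of Z:
\[
  \operatorname{MSEP}_{\theta}(\delta)
    = E_{\theta}\!\left[\bigl(\delta(Y) - Z\bigr)^{2}\right].
\]
% By the Cauchy--Schwarz inequality, for any auxiliary statistic
% \psi(Y, Z) with finite positive variance,
\[
  E_{\theta}\!\left[\bigl(\delta(Y) - Z\bigr)^{2}\right]
    \;\ge\;
    \frac{\operatorname{Cov}_{\theta}\!\bigl(\delta(Y) - Z,\; \psi(Y, Z)\bigr)^{2}}
         {\operatorname{Var}_{\theta}\!\bigl(\psi(Y, Z)\bigr)}.
\]
% Chapman--Robbins/Kshirsagar-type bounds arise by choosing \psi as a
% finite difference of likelihood ratios, e.g.
\[
  \psi(Y, Z) = \frac{f(Y, Z \mid \theta')}{f(Y, Z \mid \theta)} - 1,
\]
% which requires no differentiability of f in \theta, unlike the
% Bhattacharyya bounds.
```

The first inequality follows because Cov²(A, ψ) ≤ Var(A) Var(ψ) ≤ E[A²] Var(ψ) with A = δ(Y) − Z; the specific choices of ψ that yield the paper's sharper bounds are not reproduced here.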