Consider the problem of estimating $\mu$ based on the observations $Y_0, Y_1, \ldots, Y_n$, where it is assumed only that $Y_0, Y_1, \ldots, Y_\kappa$ are iid $N(\mu, \sigma^2)$ for some unknown $\kappa$. Unlike the traditional change-point problem, the focus here is not on estimating $\kappa$, which is now a nuisance parameter. When it is known that $\kappa = k$, the sample mean $\bar{Y}_k = \sum_{i=0}^{k} Y_i/(k+1)$ provides, in addition to good efficiency properties, safety in the sense that it is minimax under squared error loss. Unfortunately, this safety breaks down when $\kappa$ is unknown; indeed, if $k > \kappa$, the risk of $\bar{Y}_k$ is unbounded. To address this problem, a generalized minimax criterion is considered whereby each estimator is evaluated by its maximum risk under $Y_0, Y_1, \ldots, Y_\kappa$ iid $N(\mu, \sigma^2)$ for each possible value of $\kappa$. An essentially complete class under this criterion is obtained. Generalizations to other situations, such as variance estimation, are illustrated.
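The risk behavior described above can be illustrated by simulation. The sketch below (a hypothetical setup, not from the paper: the chosen values of $\mu$, $\sigma$, $\kappa$, $n$, and the post-change mean shift are illustrative assumptions) estimates the squared-error risk of $\bar{Y}_k$ when only the pre-change data are averaged ($k = \kappa$) versus when the average runs past the change point ($k > \kappa$), in which case the risk grows with the size of the shift:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, kappa = 0.0, 1.0, 9   # first kappa+1 = 10 observations are iid N(mu, sigma^2)
delta = 5.0                      # hypothetical mean shift after the change point
n, reps = 19, 20000              # total index range 0..n, Monte Carlo replications

def mse_of_mean(k):
    """Empirical risk of Ybar_k = sum_{i=0}^k Y_i / (k+1) under squared error loss."""
    pre = rng.normal(mu, sigma, size=(reps, kappa + 1))          # pre-change data
    post = rng.normal(mu + delta, sigma, size=(reps, n - kappa)) # post-change data
    y = np.hstack([pre, post])
    ybar = y[:, :k + 1].mean(axis=1)
    return np.mean((ybar - mu) ** 2)

risk_safe = mse_of_mean(kappa)  # averages only pre-change data: about sigma^2/(kappa+1) = 0.1
risk_bad = mse_of_mean(n)       # averages past the change point: inflated by the bias from delta
print(risk_safe, risk_bad)
```

Since the post-change mean is unrestricted under the model, the bias of $\bar{Y}_k$ for $k > \kappa$ can be made arbitrarily large by increasing the shift, which is the sense in which the risk is unbounded.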