Estimation of the $k$th Derivative of a Distribution Function

(1) $\hat F_n^{(k)}(x; h_n) = (2h_n)^{-k} \sum_{j=0}^{k} (-1)^{j} \binom{k}{j} F_n(x_j)$, where $x_j = x + (k - 2j)h_n$. $F_n$ denotes the empirical df based on $X_1, \ldots, X_n$, and $\{h_n\}$ is a suitably chosen sequence of positive numbers converging to zero. When there is no danger of confusion we will omit the subscript on $h$. We assume throughout that $k \ge 1$. We investigate the consistency, asymptotic bias, variance, and mean square error of this estimator, and discuss the minimization of the latter through judicious choice of the sequence $\{h_n\}$. This generalizes results of Rosenblatt (1956), who treated the case $k = 1$. Gaffey (1959) made use of the estimator (1) and essentially proved Theorem 2 of this paper. Schuster (1969) considered a different estimator of $F^{(k)}(x)$ for which he proved a.s. uniform convergence subject to
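As an informal illustration (not part of the paper), the estimator (1) can be sketched in a few lines: it applies a $k$th-order symmetric finite difference of span $2h$ to the empirical df. The function name and the uniform-sample check below are our own choices, not the authors'.

```python
import math
import numpy as np

def fk_hat(x, sample, k, h):
    """Estimate F^{(k)}(x) via the finite difference (1):
    (2h)^{-k} * sum_{j=0}^{k} (-1)^j C(k, j) F_n(x + (k - 2j) h),
    where F_n is the empirical df of the sample."""
    data = np.sort(np.asarray(sample, dtype=float))
    n = data.size
    j = np.arange(k + 1)
    # Evaluation points x_j = x + (k - 2j) h
    points = x + (k - 2 * j) * h
    # Empirical df: proportion of observations <= t
    Fn = np.searchsorted(data, points, side="right") / n
    coeffs = (-1.0) ** j * np.array([math.comb(k, jj) for jj in j])
    return (2.0 * h) ** (-k) * np.sum(coeffs * Fn)

# For k = 1 this is Rosenblatt's density estimator: on a large
# Uniform(0, 1) sample the estimate near x = 0.5 should be close to 1.
rng = np.random.default_rng(0)
u = rng.uniform(0.0, 1.0, size=100_000)
print(fk_hat(0.5, u, k=1, h=0.05))
```

The choice of $h$ trades bias against variance, which is exactly the mean-square-error minimization discussed in the text.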