FOR DEPENDENT RANDOM VARIABLES
2. A convergence theorem for independent systems. Let each $X_{nk}$ have mean $\alpha_{nk}$ and variance $\sigma_{nk}^2$, which we shall assume exists. Let $S_n = \sum_{k=1}^{k_n} X_{nk}$ and $\sigma_n^2 = \sum_{k=1}^{k_n} \sigma_{nk}^2$. If for each $n$ the variables $X_{n1}, X_{n2}, \ldots, X_{nk_n}$ are independent, we say that $(X_{nk})$ is an independent system. We write $\mathcal{L}(S_n) \to \mathcal{L}(X)$ if $F_n(x)$, the distribution function of $S_n$, converges to $F(x)$, the distribution function of $X$, at each continuity point of $F$. We write $\mathcal{L}(X) = \mathcal{L}(Y)$ when $X$ and $Y$ have the same distribution. It is well known that if a random variable is infinitely divisible and has finite variance, it can be represented by the formula of Kolmogorov [2] with a unique real constant $c$ and a bounded nondecreasing function $K(u)$ which is right continuous with $K(-\infty) = 0$. (Henceforth we shall call a function with these properties a Kolmogorov function.) It is also known [2] that if $(X_{nk})$ is an independent system of random variables having finite variances and such that $(X_{nk} - \alpha_{nk})$ is infinitesimal, then $\mathcal{L}(S_n) \to \mathcal{L}(X)$ if there is a Kolmogorov function $K(u)$ and a constant $c$ such that, as $n \to \infty$,
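For reference, the Kolmogorov representation invoked above is, in its standard form for an infinitely divisible law with finite variance (the symbols $c$ and $K$ here match the constant and Kolmogorov function of the text; the characteristic function $\varphi$ is notation supplied here):

\[
\log \varphi(t) \;=\; i c t \;+\; \int_{-\infty}^{\infty} \bigl(e^{itu} - 1 - itu\bigr)\,\frac{dK(u)}{u^{2}},
\]

where the integrand is assigned its limiting value $-t^{2}/2$ at $u = 0$. In particular, taking $K$ to be a single jump of height $\sigma^2$ at $u = 0$ recovers the normal law $N(c, \sigma^2)$.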
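As a numerical illustration (not from the paper) of the convergence $\mathcal{L}(S_n) \to \mathcal{L}(X)$ for an infinitesimal independent system, the following sketch builds row sums of centered, scaled uniform variables; here $X$ is standard normal, corresponding to a Kolmogorov function with a single jump at the origin. All names and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def row_sums(n, samples):
    # Independent system: X_{n1}, ..., X_{nn} are i.i.d. uniform on [-1, 1]
    # (so alpha_{nk} = 0), scaled so that Var(S_n) = 1.  Each summand has
    # variance 1/n, so the array (X_{nk}) is infinitesimal:
    # max_k P(|X_{nk}| >= eps) -> 0 as n -> infinity.
    X = rng.uniform(-1.0, 1.0, size=(samples, n)) / np.sqrt(n / 3.0)
    return X.sum(axis=1)

S = row_sums(400, 200_000)
# The empirical law of S_n should be close to N(0, 1).
print(S.mean(), S.var())
```

The scaling uses $\mathrm{Var}\,U[-1,1] = 1/3$, so dividing by $\sqrt{n/3}$ makes each row sum have unit variance; the printed mean and variance should be near $0$ and $1$ respectively.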