Abstract

A problem of interest in genetics is that of testing whether a mixture of two binomial distributions Bi(k, p) and Bi(k, 1/2) is simply the pure distribution Bi(k, 1/2). This problem arises in determining whether we have a genetic marker for a gene responsible for a heterogeneous trait, that is, a trait which can be caused by any one of several genes. In that event we would have a nontrivial mixture involving p ≠ 1/2. Standard asymptotic theory breaks down for such problems, which belong to a class of problems where a natural parametrization represents a single distribution, under the hypothesis to be tested, by infinitely many possible parameter points. That difficulty may be eliminated by a transformation of parameters, but then a second problem appears: the regularity conditions required for the applicability of the Fisher information fail when k > 2. We present an approach that makes use of the Kullback–Leibler information, of which the Fisher information is a limiting case. Several versions of the binomial mixture problem are studied, and the asymptotic analysis is supplemented by the results of simulations. It is shown that, as n → ∞, the asymptotic distribution of twice the logarithm of the likelihood ratio is that of the square of the supremum of a Gaussian stochastic process with mean 0, variance 1, and a well-behaved covariance function. As k → ∞, this limiting distribution grows stochastically as log k.
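To make the testing problem concrete, the sketch below simulates the null distribution of twice the log likelihood ratio for testing whether the mixture (1 − β) Bi(k, 1/2) + β Bi(k, p) reduces to the pure Bi(k, 1/2), maximizing the mixture likelihood over a crude (β, p) grid. This is only an illustrative simulation under stated assumptions, not the authors' procedure: the mixture weight β, the grid resolution, and the sample sizes are arbitrary choices, and the paper's sup-of-Gaussian-process limit comes from asymptotic analysis rather than from such a brute-force search.

```python
# Illustrative sketch (assumed setup, not the paper's code): estimate the null
# distribution of 2*log(likelihood ratio) for a binomial mixture test.
import numpy as np
from scipy.stats import binom

def lrt_statistic(counts, k, betas, ps):
    """2*log LR for one sample of counts, each count out of k trials."""
    # Log-likelihood under the null hypothesis Bi(k, 1/2).
    null_ll = binom.logpmf(counts, k, 0.5).sum()
    # Maximize the mixture log-likelihood over a grid of (beta, p);
    # beta = 0 recovers the null model, so best >= null_ll.
    pmf_half = binom.pmf(counts, k, 0.5)
    best = null_ll
    for p in ps:
        pmf_p = binom.pmf(counts, k, p)
        for beta in betas:
            ll = np.log(beta * pmf_p + (1.0 - beta) * pmf_half).sum()
            best = max(best, ll)
    return 2.0 * (best - null_ll)

def simulate_null(k, n, reps, rng):
    """Draw `reps` datasets of size n under Bi(k, 1/2) and return the statistics."""
    betas = np.linspace(0.0, 1.0, 21)      # illustrative grid for the mixing weight
    ps = np.linspace(0.01, 0.99, 49)       # illustrative grid for p
    stats = np.empty(reps)
    for r in range(reps):
        counts = rng.binomial(k, 0.5, size=n)   # data generated under H0
        stats[r] = lrt_statistic(counts, k, betas, ps)
    return stats

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    for k in (2, 5, 10):
        stats = simulate_null(k=k, n=200, reps=200, rng=rng)
        print(f"k={k:2d}  median 2 log LR = {np.median(stats):.3f}  "
              f"95th percentile = {np.percentile(stats, 95):.3f}")
```

If the abstract's asymptotics carry over to such finite-sample simulations, the upper quantiles of the simulated statistic should drift upward slowly, on the order of log k, as k increases.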