Optimal test length for multiple prediction: The general case

The concepts of differential prediction and multiple absolute prediction were developed in earlier papers [2, 3]. Methods for determining the optimal distribution of testing time for each type of prediction are available [4, 5] and are appropriate provided that none of the altered time allotments approaches zero. In this article the methods developed in [4, 5] are extended to cover cases in which the altered time allotment for one or more tests may approach zero. The resulting procedures are illustrated by numerical examples, after which the mathematical rationales are presented.
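In general terms, the situation may be framed as a constrained allocation problem; the following display is an illustrative sketch only (the symbols R, t_i, and T are assumptions introduced here, not notation from this article). A time allotment that approaches zero corresponds to a boundary solution in which the associated test is, in effect, dropped from the battery.

\[
  \max_{t_1,\dots,t_n} \; R(t_1,\dots,t_n)
  \quad \text{subject to} \quad
  \sum_{i=1}^{n} t_i = T, \qquad t_i \ge 0,
\]

where R denotes the criterion of predictive efficiency being maximized, t_i the time allotted to test i, and T the total testing time available.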