A Comparison of Ballistic Resistance Testing Techniques in the Department of Defense

Ballistic resistance testing is conducted in the Department of Defense (DoD) to estimate the probability that a projectile will perforate the armor of a system under test. Ballistic resistance testing routinely employs sensitivity experiment techniques, in which sequential test designs are used to estimate a particular quantile of the probability of perforation. Statistical procedures used to estimate the ballistic resistance of armor in the DoD have remained relatively unchanged for decades. In the current fiscal atmosphere of sequestration and budget deficits, efficiency is critical for test and evaluation. In this paper, we review and compare the sequential methods, estimators, and stopping criteria used in the DoD to those found in the literature. Using Monte Carlo simulation, we find that the three-phase optimal design, a probit model, and a break-separation stopping criterion are the most accurate and efficient at estimating V50, while the three-phase optimal design or the Robbins-Monro-Joseph method should be used to estimate V10.
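
To make the sequential setup concrete, the following is a minimal Python sketch (not taken from the paper) of a basic Robbins-Monro-type recursion run against a simulated probit response. The true location and scale of the armor response, the starting velocity, and the simple a/n gain sequence are illustrative assumptions; the optimized step sizes of the Robbins-Monro-Joseph refinement are not reproduced here.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical "true" armor response: probit model with location mu and scale sigma.
mu_true, sigma_true = 1000.0, 50.0   # assumed values (e.g., ft/s)

def shoot(velocity):
    """Simulate one shot: return 1 if the projectile perforates, else 0."""
    p_perforate = norm.cdf((velocity - mu_true) / sigma_true)
    return int(rng.random() < p_perforate)

def robbins_monro(p=0.5, x0=900.0, a=100.0, n_shots=50):
    """Basic Robbins-Monro recursion targeting the p-th quantile (V50 when p=0.5).

    Update: x_{n+1} = x_n - (a / n) * (y_n - p), where y_n is the binary outcome.
    The a/n gain is a textbook choice; Joseph (2004) derives an optimized gain
    sequence for binary responses, which this sketch does not implement.
    """
    x = x0
    for n in range(1, n_shots + 1):
        y = shoot(x)
        x = x - (a / n) * (y - p)
    return x

# One simulated sensitivity experiment; a Monte Carlo study repeats this many times.
print(f"Estimated V50 after 50 shots: {robbins_monro():.1f} (true V50 = {mu_true})")
```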