An information theoretic analysis of sequential decision-making

We provide a novel analysis of Wald's sequential probability ratio test based on information theoretic measures for symmetric thresholds, symmetric noise, and equally likely hypotheses. This test is optimal in the sense that it yields the minimum mean decision time for given error probabilities. To analyze the decision-making process, we consider information densities, which represent the stochastic information content of the observations and induce a random termination time of the test. Based on this, we show that the conditional probability of deciding for hypothesis H1 (or the competing hypothesis H0), given that the test terminates at time instant k, is independent of k. An analogous property has been found for a continuous-time first-passage problem with two absorbing boundaries in the contexts of non-equilibrium statistical physics and communication theory. Moreover, we study the evolution of the mutual information between the binary variable to be tested and the output of the Wald test. Notably, we show that the decision time of the Wald test carries no information about which hypothesis is true beyond what is already conveyed by the decision outcome.
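
For concreteness, the symmetric setting can be sketched as follows; the notation i_k, S_n, A, T, and \hat{H} below is illustrative and not taken from the paper. Each observation x_k contributes an information density, the accumulated sum is compared against symmetric thresholds \pm A, and the test stops at the first threshold crossing.

```latex
% Minimal sketch of a symmetric SPRT; all symbols (i_k, S_n, A, T, \hat{H}) are illustrative.
\begin{align*}
  i_k &= \log \frac{p(x_k \mid H_1)}{p(x_k \mid H_0)}
      && \text{(information density of the $k$-th observation)} \\
  S_n &= \sum_{k=1}^{n} i_k
      && \text{(accumulated log-likelihood ratio)} \\
  T   &= \min\{\, n \ge 1 : |S_n| \ge A \,\}
      && \text{(stopping time for symmetric thresholds $\pm A$)} \\
  \hat{H} &=
  \begin{cases}
    H_1, & S_T \ge  A, \\
    H_0, & S_T \le -A,
  \end{cases}
      && \text{(decision at termination)}
\end{align*}
```

In this notation, the time-independence property stated above reads $P(\hat{H} = H_1 \mid T = k) = P(\hat{H} = H_1)$ for all k.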