Exploring Expected Utility Surfaces by Markov Chains

In this paper, we present a probability model and a Markov chain sampler for exploring expected utility surfaces in decision theory and optimal design problems. The overall goal is to exploit Markov chain techniques to address computationally intensive decision problems. Technically, we propose to generate a sample of decisions by constructing a probability model on both the problem's unknowns and the decision variables. We achieve this by augmenting the given probability model in such a way that the marginal distribution of the sampled decisions is proportional to expected utility. Analyzing the sampled decisions then provides guidance for decision making. This approach applies to a wide class of design and decision problems. We illustrate it with applications to the design of a screening trial and to a k-armed bandit problem in clinical trials.
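To make the augmentation idea concrete, the following is a minimal sketch (not the paper's implementation) on a hypothetical toy problem: a scalar decision d, an unknown theta with a standard normal prior, and a nonnegative utility u(d, theta) = exp(-(d - theta)^2). A Metropolis sampler targets the augmented density h(d, theta) proportional to u(d, theta) p(theta); the marginal of the sampled d values is then proportional to the expected utility U(d), so regions of d visited most often are the high-utility decisions.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_prior(theta):
    # log p(theta) for theta ~ N(0, 1), up to an additive constant
    return -0.5 * theta**2

def log_utility(d, theta):
    # log u(d, theta) with u(d, theta) = exp(-(d - theta)^2) >= 0
    return -(d - theta)**2

def log_h(d, theta):
    # Augmented target h(d, theta) proportional to u(d, theta) * p(theta);
    # integrating theta out gives a d-marginal proportional to U(d).
    return log_utility(d, theta) + log_prior(theta)

def sample_decisions(n_iter=20000, step=1.0):
    """Random-walk Metropolis on (d, theta); returns the sampled d values."""
    d, theta = 0.0, 0.0
    draws = np.empty(n_iter)
    for i in range(n_iter):
        d_prop = d + step * rng.normal()
        theta_prop = theta + step * rng.normal()
        # Accept with probability min(1, h(proposal) / h(current))
        if np.log(rng.uniform()) < log_h(d_prop, theta_prop) - log_h(d, theta):
            d, theta = d_prop, theta_prop
        draws[i] = d
    return draws

draws = sample_decisions()
# In this toy problem U(d) peaks at d = 0, so after burn-in the
# sampled decisions should concentrate around zero.
burned = draws[5000:]
print(burned.mean(), burned.std())
```

In this example the expected utility can be computed in closed form (the d-marginal is Gaussian, centered at zero), which makes it easy to check that the sampler concentrates on high-utility decisions; in realistic design problems the same scheme is used precisely because U(d) has no closed form.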