Dynamic Workflow Composition: Using Markov Decision Processes

The advent of Web services has made automated workflow composition relevant to Web-based applications. One technique that has received attention for automatically composing workflows is AI-based classical planning. However, classical planning assumes that Web services behave deterministically, an assumption at odds with real-world environments; workflows generated this way therefore require the additional overhead of execution monitoring to recover from unexpected service behavior caused by service failures and by the dynamic nature of the environment. To address these concerns, we propose modeling workflow composition as a Markov decision process (MDP). To account for uncertainty about the true environmental model and for dynamic environments, we interleave MDP-based workflow generation with Bayesian model learning. Consequently, our method models both the inherent stochastic nature of Web services and the dynamic nature of the environment. Our algorithm produces workflows that are robust to non-deterministic behaviors of Web services and that adapt to a changing environment. We use a supply chain scenario to demonstrate our method and provide empirical results.
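The interleaving of MDP-based planning and Bayesian model learning described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the states, actions, and reward function model a toy supply-chain step and are hypothetical, and Dirichlet pseudo-counts stand in for the Bayesian model-learning component.

```python
# Toy workflow MDP: a service invocation may succeed (advance the
# workflow), fail (stay put), or land elsewhere. All names hypothetical.
STATES = ["start", "shipped", "done"]
ACTIONS = ["invoke_shipper", "invoke_backup"]

# Dirichlet pseudo-counts over next states, one vector per (state, action).
# A uniform prior makes every transition equally plausible at first.
counts = {(s, a): {s2: 1.0 for s2 in STATES} for s in STATES for a in ACTIONS}

def transition_probs(s, a):
    """Posterior-mean transition distribution from the Dirichlet counts."""
    c = counts[(s, a)]
    total = sum(c.values())
    return {s2: c[s2] / total for s2 in STATES}

def reward(s, a, s2):
    # Reward completing the workflow; small cost per invocation.
    return 10.0 if s2 == "done" and s != "done" else -1.0

def value_iteration(gamma=0.95, iters=100):
    """Plan: compute a greedy policy for the current learned model."""
    V = {s: 0.0 for s in STATES}
    for _ in range(iters):
        V = {s: max(sum(p * (reward(s, a, s2) + gamma * V[s2])
                        for s2, p in transition_probs(s, a).items())
                    for a in ACTIONS)
             for s in STATES}
    return {s: max(ACTIONS,
                   key=lambda a: sum(p * (reward(s, a, s2) + gamma * V[s2])
                                     for s2, p in transition_probs(s, a).items()))
            for s in STATES}

def observe(s, a, s2):
    """Learn: Bayesian update of the model from one observed transition."""
    counts[(s, a)][s2] += 1.0

# Interleave: plan, execute a step, update the model, replan.
policy = value_iteration()
observe("start", "invoke_shipper", "shipped")  # simulated execution outcome
policy = value_iteration()                      # replanned with updated model
```

Because the policy is recomputed against the posterior-mean model after each observation, the generated workflow adapts as the learned service behavior drifts from the prior, which is the adaptivity the abstract claims.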