A Markov decision process model for capacity expansion and allocation

We present a finite-horizon Markov decision process (MDP) model that provides decision support in semiconductor manufacturing for critical operational decisions such as when to add capacity and when to convert capacity from one type of production to another.
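
To make the finite-horizon MDP framing concrete, below is a minimal sketch of backward induction on a toy capacity expansion and conversion problem. The state space (capacity levels for two product types), action set, costs, and demand distribution are all invented for illustration and are not taken from the paper's model.

```python
# Hypothetical illustration: a finite-horizon MDP for capacity expansion and
# conversion, solved by backward induction. All numbers below are assumptions.
import itertools

T = 12                      # planning horizon (periods)
MAX_CAP = 3                 # maximum capacity units per product type
STATES = list(itertools.product(range(MAX_CAP + 1), repeat=2))  # (cap_A, cap_B)
ACTIONS = ["hold", "expand_A", "expand_B", "convert_A_to_B"]

DEMAND = {(1, 1): 0.4, (2, 1): 0.3, (1, 2): 0.3}  # joint demand distribution
PRICE = (10.0, 12.0)        # revenue per unit of satisfied demand, per product
EXPAND_COST = 15.0
CONVERT_COST = 5.0

def transition(state, action):
    """Capacity change implied by the chosen action, plus its cost."""
    a, b = state
    if action == "expand_A" and a < MAX_CAP:
        return (a + 1, b), EXPAND_COST
    if action == "expand_B" and b < MAX_CAP:
        return (a, b + 1), EXPAND_COST
    if action == "convert_A_to_B" and a > 0 and b < MAX_CAP:
        return (a - 1, b + 1), CONVERT_COST
    return (a, b), 0.0      # "hold", or an infeasible action

def expected_revenue(state):
    """Expected single-period revenue given capacity and random demand."""
    a, b = state
    return sum(p * (PRICE[0] * min(a, d_a) + PRICE[1] * min(b, d_b))
               for (d_a, d_b), p in DEMAND.items())

# Backward induction: V[t][s] is the best expected reward from period t onward.
V = {T: {s: 0.0 for s in STATES}}
policy = {}
for t in range(T - 1, -1, -1):
    V[t], policy[t] = {}, {}
    for s in STATES:
        best_val, best_act = float("-inf"), None
        for act in ACTIONS:
            s_next, cost = transition(s, act)
            val = expected_revenue(s_next) - cost + V[t + 1][s_next]
            if val > best_val:
                best_val, best_act = val, act
        V[t][s], policy[t][s] = best_val, best_act

print("Optimal first-period action from capacity (1, 0):", policy[0][(1, 0)])
```

Because the horizon is finite, the optimal expansion/conversion policy is time-dependent: the same capacity state can call for different actions early versus late in the horizon, which is why the sketch stores a separate policy table per period.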
