Maintenance Planning Under Uncertainties Using a Continuous-State POMDP Framework

Planning under uncertainty is an area that has attracted significant attention in recent years. The Partially Observable Markov Decision Process (POMDP) is a sequential decision-making framework particularly suited to this problem. POMDPs have thus far mainly been used in robotics, in a discrete-state formulation, and only a few authors have dealt with the solution of continuous-state POMDPs. This paper introduces the concept of approximating the continuous state using a mixture of Gaussians, in order to render this methodology suitable for the problem of optimal maintenance planning for civil structures. Presently, a large part of existing infrastructure is reaching the end of its expected lifespan. The POMDP framework is used herein to take deterioration processes into account and to plan the optimal maintenance strategy for the remaining lifespan accordingly. The capabilities of the method are demonstrated through an example application on a bridge structure.
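The core idea of the abstract can be illustrated with a minimal sketch: the belief over a continuous deterioration state is represented as a mixture of Gaussians, propagated through a linear-Gaussian deterioration model, and conditioned on a noisy inspection outcome. This is not the paper's implementation; the one-dimensional state, the transition parameters `a`, `b`, `q`, and the inspection noise `r` are hypothetical choices made here for illustration only.

```python
import math

def gaussian_pdf(x, mean, var):
    """Density of N(mean, var) evaluated at x."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def predict(belief, a, b, q):
    """Propagate each mixture component through a linear-Gaussian
    deterioration step: x' = a*x + b + noise, noise ~ N(0, q)."""
    return [(w, a * m + b, a * a * v + q) for (w, m, v) in belief]

def update(belief, y, r):
    """Condition the mixture on a noisy inspection y = x + noise,
    noise ~ N(0, r); each component gets a Kalman-style update and a
    likelihood-weighted mixture weight."""
    new = []
    for (w, m, v) in belief:
        s = v + r                      # innovation variance
        k = v / s                      # Kalman gain
        new.append((w * gaussian_pdf(y, m, s), m + k * (y - m), v * (1 - k)))
    total = sum(w for (w, _, _) in new)
    return [(w / total, m, v) for (w, m, v) in new]

# Belief over a scalar damage measure: list of (weight, mean, variance)
belief = [(0.7, 1.0, 0.2), (0.3, 2.5, 0.4)]
belief = predict(belief, a=1.0, b=0.1, q=0.05)   # one period of deterioration
belief = update(belief, y=1.3, r=0.1)            # inspection outcome
```

In a maintenance-planning POMDP, a belief of this form would be carried forward between decision epochs, with inspection and repair actions modifying the update step; the Gaussian-mixture representation keeps the continuous-state belief in closed form under linear-Gaussian dynamics.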
