A Markov chain model of military personnel dynamics

Personnel retention is one of the most significant challenges facing the US Army. Central to the problem is understanding the incentives behind the stay-or-leave decision of military personnel. Using three years of data from the US Department of Defense, we construct and estimate a Markov chain model of military personnel dynamics. Unlike traditional classification approaches such as logistic regression, the Markov chain model allows us to describe personnel dynamics over time and to answer a number of managerially relevant questions. Building on the Markov chain model, we construct a finite-horizon stochastic dynamic programming model to study the monetary incentives of stay-or-leave decisions. The dynamic programming model computes the expected payoff of staying versus leaving at different stages of a military career, depending on employment opportunities in the civilian sector. We show that the stay-or-leave decisions produced by the dynamic programming model possess surprisingly strong predictive power, without requiring the personal characteristics typically employed in classification approaches. Furthermore, the results of the dynamic programming model can be used as an input to classification methods, leading to more accurate predictions. Overall, our work presents an interesting alternative to classification methods and paves the way for further investigation of personnel retention incentives.
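The finite-horizon dynamic program described above can be illustrated with a minimal backward-induction sketch. This is not the paper's model: the function name, the scalar per-stage pay streams, and the discount factor are hypothetical simplifications (the actual model conditions on the estimated Markov chain over personnel states), but the stay-versus-leave value comparison at each career stage follows the same logic.

```python
def solve_stay_leave(T, military_pay, civilian_value, discount=0.95):
    """Illustrative finite-horizon DP (hypothetical parameters).

    At each career stage t, compare the discounted value of staying
    (current military pay plus the continuation value) against the
    value of leaving for the civilian sector.

    T              -- number of decision stages in the career horizon
    military_pay   -- per-stage military compensation, length T
    civilian_value -- per-stage value of the civilian outside option, length T
    Returns (V, decision): stage values and 'stay'/'leave' choices.
    """
    V = [0.0] * (T + 1)        # V[T] = 0: terminal value beyond the horizon
    decision = [None] * T
    for t in reversed(range(T)):   # backward induction from the final stage
        stay = military_pay[t] + discount * V[t + 1]
        leave = civilian_value[t]
        V[t] = max(stay, leave)
        decision[t] = 'stay' if stay >= leave else 'leave'
    return V, decision
```

For example, with constant military pay and a civilian option that becomes attractive only late in the career, the sketch predicts staying early and leaving at the final stage; the full model replaces these scalars with expectations over the estimated transition probabilities between ranks and states.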
