The Institute for Mathematics and its Applications (IMA) at the University of Minnesota hosted the workshop "Uncertainty Quantification in Materials Modeling" on December 16–17, 2013. The workshop convened 30 scientists specializing in materials modeling and simulation together with researchers in uncertainty quantification (UQ), with the goal of identifying challenges in UQ for materials applications. The following discussion summarizes the workshop findings and serves as an invitation to participate in an ambitious follow-up activity tentatively scheduled for summer 2015.*

The workshop objectives are best understood in the context of the Materials Genome Initiative (MGI), formally announced by President Barack Obama in June 2011. This ambitious multi-agency program seeks to accelerate the introduction of novel materials into industrial products, aiming to cut both the time and the cost from discovery to deployment by a factor of two. The initiative proposes the creation of a new Materials Innovation Infrastructure as the principal means to achieve this end. This infrastructure will consist of an integrated collection of experimental data and simulation software. These two sources of materials information, experiment and computation, will be seamlessly integrated using data informatics tools tailored for the materials science community. The result will be a set of shared community resources providing rapid access to data and to numerical simulations on demand.

While all elements of this innovation infrastructure will require investment in research and development, the IMA workshop focused on the role of numerical simulation in materials science and engineering. Increased reliance on computational tools to design and assess materials in silico promises rapid and relatively inexpensive exploration of design possibilities. At the same time, critical assessment of simulation outputs is a prerequisite for building understanding of, and trust in, the capabilities of simulation for industrial decision making.

Uncertainty quantification is an umbrella term for the diverse analysis methods and tools suited to critical assessment of models and simulations. Elements of UQ analysis include verification and validation (V&V), sensitivity analysis, and uncertainty propagation. For completeness we briefly define these terms here and refer to the recent report by the National Research Council, and the numerous references therein, for more detailed discussion. Verification is an analysis of the quantitative correspondence between a conceptual model and its numerical implementation. It is exercised primarily in the computational domain and may be viewed as a modern descendant of classical numerical analysis. Verification is not to be confused with validation, which is an analysis of the quantitative correspondence between a numerical simulation and the desired quantities of interest in a physical experiment. Here critical attention must be paid to the analysis and quantification of systematic errors introduced by the selection of a specific model and by the associated reductions and simplifications of the physics relative to the referent physical process. Colloquially speaking, it is here that one attempts to quantify the "unknown unknowns." In general, such errors are very difficult to quantify.
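To make the idea of forward uncertainty propagation concrete, the following is a minimal sketch of Monte Carlo propagation of assumed input uncertainties through a hypothetical Arrhenius-type rate model. The model form, parameter values, and input distributions are illustrative assumptions only and are not drawn from the workshop or from any specific materials code.

```python
# Minimal sketch: forward uncertainty propagation by Monte Carlo sampling.
# Model and distributions below are hypothetical, chosen only for illustration.
import numpy as np

rng = np.random.default_rng(seed=0)

def model(activation_energy, prefactor, temperature=300.0):
    """Hypothetical Arrhenius-type rate model: a stand-in for any simulation
    mapping uncertain inputs to a quantity of interest."""
    k_b = 8.617e-5  # Boltzmann constant, eV/K
    return prefactor * np.exp(-activation_energy / (k_b * temperature))

n_samples = 100_000
# Assumed input uncertainties (normal and lognormal are illustrative choices).
activation_energy = rng.normal(loc=0.5, scale=0.02, size=n_samples)      # eV
prefactor = rng.lognormal(mean=np.log(1e13), sigma=0.1, size=n_samples)  # 1/s

rates = model(activation_energy, prefactor)

# Summarize the propagated uncertainty in the quantity of interest.
print(f"mean rate   : {rates.mean():.3e} 1/s")
print(f"std dev     : {rates.std():.3e} 1/s")
print(f"95% interval: [{np.percentile(rates, 2.5):.3e}, "
      f"{np.percentile(rates, 97.5):.3e}] 1/s")
```

In practice, when each model evaluation is an expensive materials simulation, plain Monte Carlo sampling of this kind is often replaced or augmented by surrogate-based approaches that require far fewer model evaluations; the sampling idea, however, remains the same.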
An important area of research in validation is to assess how confidence in the prediction of a model degrades while interpolating between or extrapolating from conditions where

*This work is a contribution of the National Institute of Standards and Technology and is not subject to copyright in the United States.