Quantifying electron temperature distributions from time-integrated x-ray emission spectra

K-shell x-ray emission spectroscopy is a standard tool for diagnosing the plasma conditions created in high-energy-density physics experiments. In the simplest approach, the emissivity-weighted average temperature of the plasma is extracted by fitting an emission spectrum to a single-temperature condition. It is known, however, that a range of plasma conditions can contribute to the measured spectra due to a combination of the temporal evolution of the sample and spatial gradients. In this work, we define a parameterized model of the temperature distribution and use Markov chain Monte Carlo sampling of the input parameters, yielding uncertainties in the fit parameters that allow us to assess the uniqueness of the inferred temperature distribution. We present an analysis of time-integrated S and Fe x-ray spectroscopic data from the Orion laser facility and demonstrate that, while fitting each spectral region to a single temperature yields two different temperatures, both spectra can be fit simultaneously with a single temperature distribution. We find that fitting both spectral regions together requires a maximum temperature of 1310 −70/+90 eV, with significant contributions from temperatures down to 200 eV.
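The inference procedure described above can be illustrated with a minimal sketch. The forward model, the power-law form of the temperature distribution, the line positions, and all numerical values below are hypothetical placeholders, not the paper's actual spectroscopic model; the sketch only shows how Metropolis-Hastings sampling of a parameterized distribution yields asymmetric uncertainties on a maximum temperature.

```python
import numpy as np

rng = np.random.default_rng(0)

def spectrum(T_max, alpha, energies):
    """Toy emissivity-weighted spectrum from a temperature distribution.

    The distribution is assumed to be f(T) ~ T**alpha on [200 eV, T_max]
    (a hypothetical parameterization); each temperature contributes a
    Gaussian 'line' whose position shifts weakly with T (toy physics).
    """
    Ts = np.linspace(200.0, T_max, 40)            # temperature grid, eV
    w = Ts ** alpha
    w /= w.sum()                                  # distribution weights
    out = np.zeros_like(energies)
    for T, wi in zip(Ts, w):
        center = 2430.0 + 0.05 * T                # illustrative line position, eV
        out += wi * np.exp(-0.5 * ((energies - center) / 5.0) ** 2)
    return out

# Synthetic "measured" spectrum with known parameters and Gaussian noise.
energies = np.linspace(2400.0, 2600.0, 200)
data = spectrum(1310.0, 1.5, energies) + 0.01 * rng.normal(size=energies.size)

def log_post(params):
    """Log-posterior: flat priors with hard bounds, Gaussian likelihood."""
    T_max, alpha = params
    if not (300.0 < T_max < 3000.0 and 0.0 < alpha < 5.0):
        return -np.inf
    resid = data - spectrum(T_max, alpha, energies)
    return -0.5 * np.sum((resid / 0.01) ** 2)

# Random-walk Metropolis-Hastings over the two model parameters.
chain = [np.array([1200.0, 1.2])]
lp = log_post(chain[-1])
accepts = 0
n_steps = 2000
for _ in range(n_steps):
    prop = chain[-1] + rng.normal(scale=[20.0, 0.1])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:      # accept/reject step
        chain.append(prop); lp = lp_prop; accepts += 1
    else:
        chain.append(chain[-1])
chain = np.array(chain)

# Posterior percentiles on T_max give asymmetric error bars of the
# form quoted in the abstract (e.g. 1310 -70/+90 eV).
lo, med, hi = np.percentile(chain[n_steps // 2:, 0], [16, 50, 84])
print(f"T_max = {med:.0f} -{med - lo:.0f} +{hi - med:.0f} eV "
      f"(acceptance {accepts / n_steps:.2f})")
```

Because the likelihood is evaluated on the full forward-modeled spectrum, the same machinery extends directly to fitting two spectral regions simultaneously: the residual vector simply concatenates both regions before entering the log-likelihood.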