Why Use Signal-to-Noise as a Measure of MS Performance When It Is Often Meaningless?

The signal-to-noise ratio of a chromatographic peak determined from a single measurement has long served as a convenient figure of merit for comparing the performance of two different MS systems. The evolution of mass spectrometry instrumentation has produced very low-noise systems that make performance comparisons based on signal-to-noise increasingly difficult, and in some modes of operation impossible. This is especially true in ultra-low-noise modes such as high-resolution mass spectrometry or tandem MS, where there are often no ions in the background and the noise is essentially zero. This situation commonly arises when analyzing the clean standards used to establish instrument specifications. The statistical methodology commonly used to establish method detection limits for trace analysis in complex matrices offers a means of characterizing instrument performance that is rigorously valid under both high and low background-noise conditions. Instrument manufacturers should provide customers an alternative performance metric in the form of instrument detection limits based on the relative standard deviation of replicate injections, giving analysts a practical means of evaluating an MS system.

Authors: Greg Wells, Harry Prest, and Charles William Russ IV, Agilent Technologies, Inc., 2850 Centerville Road, Wilmington, DE 19809-1610, USA
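A minimal sketch of the replicate-injection calculation alluded to above, in the style of a standard method-detection-limit computation (detection limit = Student's t at 99% confidence times the standard deviation of replicate measurements). The peak areas, the injected amount, and the t value are illustrative assumptions, not data from this note:

```python
import statistics

# Hypothetical peak areas (arbitrary units) from eight replicate injections
# of a low-level standard -- illustrative values only.
areas = [1020.0, 998.0, 1011.0, 1005.0, 989.0, 1013.0, 1001.0, 1008.0]

# One-sided Student's t critical value at 99% confidence for
# n - 1 = 7 degrees of freedom (as used in EPA-style MDL calculations).
T_99_DF7 = 2.998

mean = statistics.mean(areas)
s = statistics.stdev(areas)          # sample standard deviation of the replicates
rsd_percent = 100.0 * s / mean       # relative standard deviation, in percent

amount_injected = 10.0               # assumed amount on column, e.g. femtograms

# Instrument detection limit in the same units as amount_injected:
# scale the response variability back to an amount.
idl = T_99_DF7 * (s / mean) * amount_injected

print(f"RSD = {rsd_percent:.2f}%")
print(f"IDL = {idl:.2f} (same units as amount injected)")
```

Because the calculation depends only on the scatter of replicate measurements, it remains well defined even when the chromatographic background is essentially zero and a conventional signal-to-noise ratio cannot be computed.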