Noise in interferometric optical systems: an optical Nyquist theorem

The index of refraction of an optical medium at a temperature above absolute zero undergoes statistical fluctuations, which in turn introduce fluctuations in the phase of an optical signal propagating in the medium. The magnitude and spectral density of these phase fluctuations are calculated, and it is shown that they can be larger than the phase uncertainty due to quantum noise. Barring the onset of nonlinearities, it is these fluctuations, and not quantum noise, that set the limit on the measurement of optical phase.
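
As a rough illustration of why the thermodynamic contribution can dominate, the following is a minimal order-of-magnitude sketch, not taken from the paper, comparing the rms phase fluctuation produced by thermodynamic temperature fluctuations in a fiber interferometer arm with the shot-noise (quantum) phase limit. All parameter values (fiber length, mode radius, fused-silica constants, detected power) are illustrative assumptions.

```python
import numpy as np

# Order-of-magnitude estimate of thermodynamic phase noise in one arm of a
# fiber interferometer, compared with the shot-noise (quantum) phase limit.
# All numerical values below are illustrative assumptions for fused silica
# at room temperature, not figures taken from the paper.

k_B   = 1.380649e-23      # Boltzmann constant [J/K]
T     = 300.0             # temperature [K]
rho   = 2.2e3             # density of fused silica [kg/m^3]
c_p   = 7.0e2             # specific heat [J/(kg K)]
dn_dT = 1.0e-5            # thermo-optic coefficient [1/K]
lam   = 1.55e-6           # optical wavelength [m]
L     = 1.0e3             # fiber length [m]
r     = 5.0e-6            # mode radius [m] (sets the fluctuating volume)

# Thermodynamic temperature fluctuation in the mode volume V:
#   <dT^2> = k_B T^2 / (rho c_p V)
V = np.pi * r**2 * L
dT_rms = np.sqrt(k_B * T**2 / (rho * c_p * V))

# Resulting rms phase fluctuation, with phi = (2 pi / lambda) n L and only
# the thermo-optic contribution retained (thermal expansion neglected):
dphi_thermal = (2 * np.pi / lam) * L * dn_dT * dT_rms

# Shot-noise-limited phase uncertainty for N detected photons,
#   dphi_quantum ~ 1/sqrt(N), here 1 mW detected for 1 s at 1.55 um:
P, tau = 1e-3, 1.0
h, c = 6.62607015e-34, 2.998e8
N = P * tau / (h * c / lam)
dphi_quantum = 1.0 / np.sqrt(N)

print(f"thermal phase noise ~ {dphi_thermal:.2e} rad (rms)")
print(f"quantum phase noise ~ {dphi_quantum:.2e} rad (rms)")
```

With these assumed numbers the thermodynamic estimate comes out several orders of magnitude above the shot-noise limit, consistent with the abstract's claim that, absent nonlinearities, index fluctuations rather than quantum noise bound the phase measurement.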