Optical solitary waves in the presence of a Lorentzian gain line: limitations of the Ginzburg–Landau model

Abstract

The domain of validity of the Ginzburg–Landau equation, used as a model for describing optical solitary waves in the presence of distributed amplification, is shown to be restricted to the small-excess-gain region. Our analysis, based instead on a complex Lorentzian gain model, suggests that, to a good approximation, the main effect of the amplifier is to select a particular N = 1 soliton of the 'classical' nonlinear Schrödinger equation. Analytical approximations accurately describe this process.
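For concreteness, the following is a minimal sketch of the governing equations presumably intended here, written in standard soliton units; the normalization, the gain coefficient $g_0$, the line-center frequency $\omega_a$, the dephasing time $T_2$, and the parameters $\delta$ and $\beta$ are illustrative assumptions rather than quantities taken from this section:

\[
  i\,\frac{\partial u}{\partial z} + \frac{1}{2}\,\frac{\partial^2 u}{\partial t^2} + |u|^2 u
  = i\,\hat{g}\,u,
  \qquad
  \hat{g}(\omega) = \frac{g_0}{1 - i\,(\omega - \omega_a)\,T_2},
\]

where the complex Lorentzian gain $\hat{g}$ acts multiplicatively in the frequency domain (the sign in the denominator depends on the Fourier convention). Expanding $\hat{g}(\omega)$ to second order in $(\omega - \omega_a)$, and absorbing the first-order term into a shift of the reference frame, yields the Ginzburg–Landau (parabolic-gain) approximation,

\[
  i\,u_z + \tfrac{1}{2}\,u_{tt} + |u|^2 u = i\,\delta\,u + i\,\beta\,u_{tt},
\]

with $\delta \sim g_0$ and $\beta \sim g_0 T_2^2$ in this normalization. This expansion is expected to hold only when the excess gain is small, consistent with the restriction on the Ginzburg–Landau model stated above.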