Compensation of polarization mode dispersion by means of the Kerr effect for nonreturn-to-zero signals.

We demonstrate numerically that compensation of polarization mode dispersion can be achieved for nonreturn-to-zero signals as a result of a trapping effect analogous to the well-known soliton behavior. The conditions for such compensation are identified, and a comparison with the soliton case is reported.
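
As a rough illustration of the kind of numerical experiment that underlies such a result (a sketch under stated assumptions, not the authors' model or code), the following Python snippet propagates the two polarization components of an NRZ-like pulse with a symmetric split-step method, using the standard coupled nonlinear Schrodinger equations with self- and cross-phase modulation. A deterministic differential group delay stands in for first-order polarization mode dispersion, the rapidly oscillating coherent-coupling term is neglected (high-birefringence limit), and every parameter value is an assumption chosen only for illustration.

```python
# Hedged sketch: split-step propagation of the two polarization components
# u, v of an NRZ-like signal in a birefringent fiber.  Linear step: group-
# velocity dispersion plus a deterministic differential group delay (stand-in
# for first-order PMD); nonlinear step: Kerr self- and cross-phase modulation.
# All parameter values are illustrative assumptions.
import numpy as np

# time/frequency grids
nt, t_win = 2048, 400e-12                       # samples, time window [s]
t = np.linspace(-t_win / 2, t_win / 2, nt, endpoint=False)
w = 2 * np.pi * np.fft.fftfreq(nt, d=t[1] - t[0])   # angular frequency [rad/s]

# fiber parameters (illustrative)
beta2 = -21e-27            # GVD [s^2/m]
gamma = 1.3e-3             # Kerr coefficient [1/(W m)]
dgd_per_len = 1e-15        # differential group delay per length [s/m] (1 ps/km)
delta = dgd_per_len / 2    # walk-off of each axis with respect to the mean
L, nz = 50e3, 2000         # fiber length [m], number of steps
dz = L / nz

# NRZ-like input: a super-Gaussian "1" of ~100 ps, split equally on both axes
P0, T0 = 20e-3, 50e-12
pulse = np.sqrt(P0) * np.exp(-0.5 * (t / T0) ** 8)
u = pulse / np.sqrt(2)
v = pulse / np.sqrt(2)

# linear propagators for half a step (dispersion + group-delay walk-off)
half_u = np.exp((1j * beta2 / 2 * w**2 - 1j * delta * w) * dz / 2)
half_v = np.exp((1j * beta2 / 2 * w**2 + 1j * delta * w) * dz / 2)

for _ in range(nz):
    # half linear step
    u = np.fft.ifft(half_u * np.fft.fft(u))
    v = np.fft.ifft(half_v * np.fft.fft(v))
    # full nonlinear step: SPM plus (2/3) XPM between the axes
    # (the coherent-coupling term is dropped, as in the high-birefringence limit)
    u_new = u * np.exp(1j * gamma * dz * (np.abs(u) ** 2 + (2 / 3) * np.abs(v) ** 2))
    v_new = v * np.exp(1j * gamma * dz * (np.abs(v) ** 2 + (2 / 3) * np.abs(u) ** 2))
    u, v = u_new, v_new
    # half linear step
    u = np.fft.ifft(half_u * np.fft.fft(u))
    v = np.fft.ifft(half_v * np.fft.fft(v))

def centroid(a):
    """Temporal centroid of the intensity profile."""
    p = np.abs(a) ** 2
    return np.sum(t * p) / np.sum(p)

# without the Kerr terms the components separate by the full DGD of the span;
# with sufficient power they tend to stay trapped together
print("centroid separation [ps]:", (centroid(u) - centroid(v)) * 1e12)
print("full DGD over the span [ps]:", dgd_per_len * L * 1e12)
```

Raising the launch power strengthens the Kerr coupling between the two components; whether they remain trapped together or walk off by the full differential group delay depends on the interplay of power, dispersion, and birefringence, which is precisely the parameter regime characterized in the paper.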