Statistical simulation of SSO noise in multi-gigabit systems

Deterministic techniques for evaluating the impact of simultaneous switching output (SSO) noise on the performance of modern high-speed systems with tight timing budgets can be overly pessimistic. This can lead to conservative designs, especially in multi-gigabit systems with an embedded coding or scrambling sublayer. To overcome the shortcomings of conventional methodologies, a statistical simulation method for evaluating the impact of SSO noise on high-speed single-ended signaling systems is presented. The method accounts for the spatial and temporal distributions of the switching activities of the devices in the system to calculate the performance degradation of the interface due to power supply noise. First, the transient response that describes the coupling of power supply noise into the signal receiver is generated. Then, using the probability distributions of the drivers' switching activities, the SSO noise distribution is determined. Finally, this distribution is combined with the channel intersymbol interference (ISI) and the device noise and jitter distributions of the signals to calculate the bit error rate (BER) of the overall system.
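The statistical flow described above can be illustrated with a minimal numerical sketch. This is not the paper's implementation: it assumes each driver independently injects a fixed noise amplitude onto the supply with a given switching probability, models the combined ISI/device-noise term as a simple Gaussian, and computes the BER as the probability that the total noise exceeds the remaining eye margin. All amplitudes, probabilities, and margins below are illustrative placeholders.

```python
import numpy as np

def convolve_pdfs(p1, p2, v_axis):
    """Convolve two PDFs sampled on a symmetric, odd-length voltage grid,
    re-windowing the result back onto the same grid."""
    dv = v_axis[1] - v_axis[0]
    full = np.convolve(p1, p2) * dv
    mid, half = len(full) // 2, len(v_axis) // 2
    return full[mid - half : mid - half + len(v_axis)]

def sso_pdf(amplitudes, p_switch, v_axis):
    """Aggregate SSO-noise PDF for independent drivers: driver i injects
    amplitudes[i] volts with probability p_switch[i] (i.i.d. data assumed),
    so the aggregate PDF is the convolution of per-driver two-point PDFs."""
    dv = v_axis[1] - v_axis[0]
    pdf = np.zeros_like(v_axis)
    pdf[np.argmin(np.abs(v_axis))] = 1.0 / dv            # start: delta at 0 V
    for a, p in zip(amplitudes, p_switch):
        single = np.zeros_like(v_axis)
        single[np.argmin(np.abs(v_axis))] += (1 - p) / dv     # driver quiet
        single[np.argmin(np.abs(v_axis - a))] += p / dv       # driver switches
        pdf = convolve_pdfs(pdf, single, v_axis)
    return pdf

def ber_from_margin(total_pdf, v_axis, margin):
    """BER = probability that the combined noise exceeds the eye margin."""
    dv = v_axis[1] - v_axis[0]
    return np.sum(total_pdf[np.abs(v_axis) > margin]) * dv

# Illustrative usage: 16 drivers, 30 mV coupled noise each, p_switch = 0.5
v = np.linspace(-1.0, 1.0, 2001)                  # volts, odd length, centred on 0
dv = v[1] - v[0]
sso = sso_pdf([0.03] * 16, [0.5] * 16, v)
sigma = 0.02                                      # combined ISI + device noise (sketch)
gauss = np.exp(-v**2 / (2 * sigma**2))
gauss /= gauss.sum() * dv
total = convolve_pdfs(sso, gauss, v)
print(ber_from_margin(total, v, margin=0.45))
```

The key property exploited here is that the noise contributions of independent drivers add, so their PDFs convolve; the same convolution step then folds in the ISI and jitter distributions before the final tail integration yields the BER.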