Analysis of a single-wavelength optical buffer

We present a detailed analysis of the loss performance of an optical buffer with access to a single outgoing channel. Such a system, consisting of a number of fiber delay lines, differs significantly from a conventional electronic buffer in that only a discrete set of delays can be realized for contention resolution. This leads to underutilization of the channel capacity, which degrades overall performance. Our analysis requires no special assumptions about the burst- or packet-size distribution, which allows us to study the impact this distribution has on performance. For the important special case of fixed-size bursts, it reveals, among other things, that matching the fiber delay line granularity to the burst duration is not necessarily optimal in terms of loss performance. It further reveals that, in general, the optimal granularity is a function not only of burst-size characteristics but of the offered load as well, making buffer design a delicate task.
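The core mechanism described above, that a fiber delay line buffer can only realize delays that are multiples of some granularity, so each contention resolution rounds the required delay up and leaves a void on the channel, can be illustrated with a minimal simulation. This is a hedged sketch, not the paper's analytical model: the Poisson arrival process, the parameter names (`granularity`, `num_lines`, `load`), and the drop rule are illustrative assumptions.

```python
import random

def simulate_fdl(granularity, num_lines, load, burst_len=1.0,
                 n_bursts=20000, seed=1):
    """Illustrative sketch of a single-wavelength FDL buffer.

    Realizable delays are the multiples 0, D, 2D, ..., N*D of the
    granularity D; a burst whose required delay exceeds N*D is lost.
    Rounding the required delay up to the next multiple creates voids
    on the outgoing channel, i.e. the underutilization noted above.
    """
    random.seed(seed)
    t = 0.0
    channel_free = 0.0          # time at which the channel becomes free
    lost = 0
    max_delay = num_lines * granularity
    for _ in range(n_bursts):
        # Poisson arrivals dimensioned so that `load` is the offered load
        t += random.expovariate(load / burst_len)
        needed = max(0.0, channel_free - t)
        # round up to the smallest realizable (discrete) delay
        k = -(-needed // granularity)   # ceiling division on floats
        delay = k * granularity
        if delay > max_delay + 1e-12:
            lost += 1                   # no delay line long enough: dropped
        else:
            channel_free = t + delay + burst_len
    return lost / n_bursts
```

Sweeping `granularity` at a fixed `load` in this toy model already shows the trade-off the abstract points to: a coarse granularity wastes capacity in voids, while a fine granularity shortens the longest realizable delay for a fixed number of lines, so the loss-minimizing choice depends on the load.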