The Gibbs phenomenon for piecewise-linear approximation

In 1899 J. W. Gibbs [1] of Yale, responding to a letter in Nature from the American physicist A. Michelson [2], presented a result about Fourier series now known as the Gibbs phenomenon. Michelson had complained of an unwanted "overshoot" that appeared whenever he approximated a function with a jump discontinuity by a finite Fourier series. Gibbs showed that this overshoot does not disappear as the number of terms in the series grows arbitrarily large. Rather, on each side of the discontinuity it approaches a constant g times one-half the size of the jump, where