Gibbs Phenomenon for Wavelets

Abstract When a Fourier series is used to approximate a function with a jump discontinuity, an overshoot occurs at the discontinuity. This behavior was noticed by Michelson [6] and explained by Gibbs [3] in 1899, and is now known as the Gibbs effect. In this paper we examine possible Gibbs effects in wavelet expansions of functions at points with jump discontinuities. Conditions on the size of the wavelet kernel are examined to determine whether a Gibbs effect occurs and, if so, its magnitude. A necessary and sufficient condition for the existence of a Gibbs effect is presented, and this condition is used to prove the existence of Gibbs effects for some compactly supported wavelets. Since wavelets are not translation invariant, the effect of a discontinuity depends on its location. Computer estimates of the sizes of the overshoots and undershoots are also given for some compactly supported wavelets with small support.
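To make the classical Fourier case concrete before turning to wavelets, the following is a minimal numerical sketch (not from the paper; the function names and sampling parameters are illustrative choices) of the overshoot the abstract describes. It evaluates a partial Fourier sum of the square wave sign(x) near its jump at 0 and compares the peak against the classical Gibbs limit (2/π)∫₀^π (sin t)/t dt ≈ 1.17898, i.e., an overshoot of roughly 8.95% of the total jump.

```python
import numpy as np

def partial_sum(x, N):
    """Partial Fourier sum S_N(x) = (4/pi) * sum over odd k <= N of sin(kx)/k,
    the truncated Fourier series of the square wave sign(x) on (-pi, pi)."""
    k = np.arange(1, N + 1, 2)  # odd harmonics only
    return (4 / np.pi) * np.sum(np.sin(np.outer(x, k)) / k, axis=1)

N = 1001
# Sample just to the right of the jump at 0; the first (highest) peak of S_N
# sits near x = pi/(N+1), well inside this window.
x = np.linspace(1e-4, 0.1, 20000)
overshoot = partial_sum(x, N).max()

# Classical Gibbs limit (2/pi) * Si(pi), approximated by a midpoint rule
# (the midpoint rule also sidesteps the removable singularity at t = 0).
m = 100_000
t = (np.arange(m) + 0.5) * (np.pi / m)
gibbs = (2 / np.pi) * np.sum(np.sin(t) / t) * (np.pi / m)

print(f"max of S_{N} near the jump: {overshoot:.5f}")
print(f"Gibbs limit (2/pi)*Si(pi):  {gibbs:.5f}")  # ~1.17898
```

Since the square wave jumps from -1 to 1, a peak value of about 1.17898 corresponds to an overshoot of about 0.17898, roughly 8.95% of the jump of size 2; this is the magnitude that the paper's wavelet-kernel conditions generalize, with the added wrinkle that for wavelets the answer depends on where the jump sits.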