What is "synchrony suppression"?

Synchrony of discharge of auditory neurons to two-tone stimuli, and the phenomenon of "synchrony suppression," have been analyzed by examining the implications of the definition of vector strength. Synchrony suppression, defined as the reduction in the vector strength for one stimulus component when a second is introduced, occurs by definition whenever partial ("half-wave") rectification operates in an otherwise linear system. It does so with the familiar shifts (on the abscissa) of empirical vector strength curves, so no compressive or other nonlinearity is needed to produce it.

Synchrony suppression is sometimes defined, incompatibly, as the shift in dB of a vector strength curve, with that shift said to be the magnitude of suppression. The identification of partial rectification with vector strength reduction and curve shift shows this conception to be incorrect, and it can be shown to rest on a logical fallacy as well.

The vector strength definition was also applied to the complex waveform obtained at the output of an instantaneous compressive amplitude nonlinearity. The shifts of vector strength growth and decay curves (at their crossover points) necessarily equal those in the linear case for any compressive nonlinearity that compresses equal inputs equally. Such compression is not, however, without noticeable effects on vector strengths. If the input levels lie in the range producing compressed outputs, differences in the relative input levels are accentuated in the relative output levels in the period histogram; compression thus yields greater differences in the vector strengths, for unequal input levels, than the linear case does. More visible effects on vector strength curves result from waveform distortion, which reduces vector strength saturation and crossover values and causes them to recede at higher input levels.
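The central claim, that half-wave rectification alone suffices to produce synchrony suppression, can be illustrated numerically. The sketch below uses the standard continuous-waveform form of vector strength, VS(f) = |Σ r(t)·e^{-i2πft}| / Σ r(t), applied to a half-wave rectified two-tone signal; the tone frequencies (100 and 137 Hz), sampling rate, and equal amplitudes are illustrative choices, not values from the source. With the first tone alone, the rectified sinusoid gives the classic VS of π/4 ≈ 0.785; adding the second tone reduces the VS of the first even though nothing in the system is compressive.

```python
import numpy as np

def vector_strength(r, f, t):
    """Vector strength of component f in the (nonnegative) waveform r(t):
    |sum r(t) exp(-i 2 pi f t)| / sum r(t)."""
    z = np.sum(r * np.exp(-2j * np.pi * f * t))
    return np.abs(z) / np.sum(r)

def halfwave(x):
    # Partial ("half-wave") rectification: keep positive half-cycles only.
    return np.maximum(x, 0.0)

fs = 100_000                       # sampling rate (Hz); illustrative
t = np.arange(0, 1.0, 1 / fs)      # 1 s window: integer cycles of both tones
f1, f2 = 100.0, 137.0              # illustrative tone frequencies (Hz)

tone1 = np.sin(2 * np.pi * f1 * t)
tone2 = np.sin(2 * np.pi * f2 * t)

# VS of tone 1 with and without the second tone, rectifier otherwise linear.
vs_alone = vector_strength(halfwave(tone1), f1, t)
vs_suppressed = vector_strength(halfwave(tone1 + tone2), f1, t)

print(f"VS of f1 alone:          {vs_alone:.3f}")   # ~ pi/4 ~ 0.785
print(f"VS of f1 with f2 added:  {vs_suppressed:.3f}")  # reduced
```

The reduction of `vs_suppressed` below `vs_alone` is synchrony suppression in the first sense defined above, arising purely from rectification of the summed waveform.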