Onset Detection through Maximal Redundancy Detection

We propose a criterion, called 'maximal redundancy', for onset detection in time series. The concept of redundancy is adopted from information theory and indicates how well a signal can locally be explained by an underlying model. It is shown that a local maximum in the redundancy is a good indicator of an onset. It is proven that 'maximal redundancy' detection is an asymptotically optimal statistical detector for autoregressive (AR) processes. It also accounts for potentially non-Gaussian time series and non-Gaussian innovations in the AR processes. Several applications are presented in which the new criterion has been successfully applied.
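To make the criterion concrete, the following is a minimal sketch of one way a windowed redundancy estimate could be computed, assuming a Gaussian AR model fitted by least squares; the window length, hop size, AR order, and peak-picking rule are illustrative choices, not the exact procedure of the paper. The redundancy of each window is approximated by half the log-ratio of the signal variance to the AR innovation variance, and onset candidates are taken at local maxima of that curve.

import numpy as np

def ar_innovation_variance(x, order):
    """Least-squares AR fit on a window; returns the residual (innovation) variance."""
    # Design matrix: predict x[t] from x[t-1], ..., x[t-order].
    X = np.column_stack([x[order - k - 1 : len(x) - k - 1] for k in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coeffs
    return np.var(resid)

def windowed_redundancy(x, win=256, hop=64, order=4):
    """Redundancy proxy per window: 0.5 * log(signal variance / innovation variance)."""
    starts = np.arange(0, len(x) - win, hop)
    red = np.array([
        0.5 * np.log(np.var(x[s : s + win]) / ar_innovation_variance(x[s : s + win], order))
        for s in starts
    ])
    return starts, red

def local_maxima(r):
    """Indices of strict local maxima in the redundancy curve (candidate onsets)."""
    return np.where((r[1:-1] > r[:-2]) & (r[1:-1] > r[2:]))[0] + 1

A typical usage would be: starts, red = windowed_redundancy(signal); onset_windows = starts[local_maxima(red)]. Because the redundancy is a log-variance ratio rather than an absolute energy measure, such a detector does not rely on the signal being Gaussian, which is in keeping with the robustness to non-Gaussian innovations claimed above.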