Time-frequency analysis is a major tool for representing the energy distribution of time-varying signals, and the properties of these representations have been studied extensively. However, there is a general lack of quantitative measures for describing the amount of information encoded in a time-frequency distribution. Recently, information-theoretic measures such as entropy and divergence have been adapted to the time-frequency plane to quantify the complexity of individual signals as well as the differences between signals. In this paper, we present a variety of information-theoretic measures and their definitions on the time-frequency plane. The properties of these measures and how they can be applied to signal classification problems are discussed in detail. We then present an application of information-theoretic signal processing to the analysis of event-related brain potentials.
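As a rough illustration of the kind of measure discussed here (a minimal sketch, not the paper's own implementation), the snippet below computes an entropy of a spectrogram treated as a normalized time-frequency energy distribution. The use of the Rényi entropy of order 3, the choice of scipy.signal.spectrogram as the distribution, and the test signals are assumptions made only for the example.

```python
import numpy as np
from scipy import signal


def renyi_entropy_tfd(x, fs, alpha=3.0):
    """Rényi entropy (order alpha) of a spectrogram treated as a
    normalized time-frequency distribution (illustrative sketch)."""
    # Spectrogram as a non-negative time-frequency energy distribution
    f, t, Sxx = signal.spectrogram(x, fs=fs)
    # Normalize so the distribution sums to one over the TF plane
    P = Sxx / np.sum(Sxx)
    # Keep only nonzero cells to avoid numerical issues
    P = P[P > 0]
    # Rényi entropy of order alpha (alpha != 1), in bits;
    # alpha = 3 is a common choice in the time-frequency literature
    return np.log2(np.sum(P ** alpha)) / (1.0 - alpha)


if __name__ == "__main__":
    fs = 1000.0
    t = np.arange(0, 1.0, 1.0 / fs)
    tone = np.sin(2 * np.pi * 100 * t)              # single tone
    sweep = signal.chirp(t, f0=50, f1=400, t1=1.0)  # linear chirp
    # The chirp spreads energy over more TF cells, so its entropy is larger,
    # reflecting a more "complex" signal under this measure.
    print(renyi_entropy_tfd(tone, fs), renyi_entropy_tfd(sweep, fs))
```

In this reading, a signal whose energy is concentrated in few time-frequency cells yields low entropy, while a signal spread over many cells yields high entropy, which is the intuition behind using such measures for signal complexity and classification.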