Signal Processing Applications Of The Bootstrap
The bootstrap is a relatively recently developed statistical method that has many attractive properties. It is more flexible than classical statistical methods, relies on fewer mathematical assumptions, and has very wide applicability. The bootstrap was introduced by Efron in 1979, although similar ideas had been suggested in different contexts earlier. Indeed, the closely related jackknife method was introduced by Quenouille as early as 1949. The basic idea of the bootstrap is quite simply to use the sample data to compute a statistic and to estimate its sampling distribution, without making any model assumptions. This is done by repeatedly drawing a large number of random samples, with replacement, from the original sample. The statistic is then computed for each of these pseudo samples, and the resulting empirical distribution is an estimate of the sampling distribution. In other words, the original sample is treated as a surrogate for the population. This simple device can be applied in a wide variety of situations, for instance to obtain confidence intervals and to perform hypothesis tests. While its usefulness in nonparametric situations is clear, even for parametric problems the bootstrap completely circumvents the need to derive complicated sampling distributions. The main drawback of the bootstrap is that it is a computer-intensive method, although with advances in technology this is less and less of a concern.

The first article, "Computer-Intensive Methods in Statistical Analysis," by Dimitris Politis, is concerned with the theoretical aspects of the bootstrap. It is a self-contained introduction to the bootstrap and provides an extensive bibliography on this topic for the interested reader. The second article, by A.M. Zoubir and B. Boashash, titled "The Bootstrap and its Application in Signal Processing," provides a brief introduction to the
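For readers who want to see the resampling idea described above in concrete terms, the following short Python sketch (not taken from either article; the toy data, the choice of the mean as the statistic, and the function names are illustrative assumptions) estimates the sampling distribution of a statistic by drawing pseudo samples with replacement and derives a percentile confidence interval from the resulting empirical distribution.

```python
import numpy as np

def bootstrap_distribution(sample, statistic, n_resamples=2000, rng=None):
    """Estimate the sampling distribution of `statistic` by drawing
    pseudo samples with replacement from the observed data."""
    rng = np.random.default_rng() if rng is None else rng
    sample = np.asarray(sample)
    n = len(sample)
    stats = np.empty(n_resamples)
    for b in range(n_resamples):
        # Treat the original sample as a surrogate for the population:
        # resample n points with replacement and recompute the statistic.
        idx = rng.integers(0, n, size=n)
        stats[b] = statistic(sample[idx])
    return stats

# Usage (illustrative): 95% percentile confidence interval for the mean
# of an assumed set of noisy measurements.
rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=0.5, size=100)  # assumed toy data
boot_means = bootstrap_distribution(data, np.mean, rng=rng)
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"Bootstrap 95% CI for the mean: [{lo:.3f}, {hi:.3f}]")
```

The same loop applies unchanged to statistics whose sampling distributions are hard to derive analytically; only the `statistic` argument changes, which is what makes the method attractive in the parametric settings mentioned above.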