A consistent, numerically efficient Bayesian framework for combining the selection, detection and estimation tasks in model-based signal processing

Bayesian marginalization is shown to measure the complexity of a model, objectively quantifying Ockham's Razor via the Ockham parameter inference (OPI). This is not possible in likelihood-based inference. The OPI rejects any hypothesis that is poorly supported by the data, leading to the Censored Marginal A Posteriori (CMaAP) estimation policy, which returns confident estimates well below the maximum-likelihood (ML) thresholds. CMaAP estimation performs an alternative-free hypothesis test, thereby subsuming detection. In a multi-hypothesis environment, the procedure combines selection and estimation into a consistent framework that avoids the numerical approximations of the Bayesian evidence approach and the MDL (minimum description length) criterion. It is therefore robust in stressful regimes and affords major computational savings.
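
As a minimal illustration of the first claim, that marginalization penalizes model complexity in a way maximized likelihood cannot, the following Python sketch compares a maximized likelihood ratio with a marginal (evidence) ratio for a toy signal-present versus noise-only test. The two hypotheses, the uniform amplitude prior, and all variable names are assumptions introduced here for illustration; this is a generic Ockham's-razor demonstration, not the paper's OPI or CMaAP construction.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: N noisy samples that may contain a constant signal of unknown amplitude.
    # (Illustrative setup only; not taken from the paper.)
    N = 50
    sigma = 1.0                      # known noise standard deviation (assumed)
    true_amplitude = 0.0             # data generated under the "noise only" hypothesis
    y = true_amplitude + sigma * rng.standard_normal(N)

    # Hypothesis H0: noise only (no free parameters).
    log_lik_H0 = -0.5 * np.sum(y**2) / sigma**2 - 0.5 * N * np.log(2 * np.pi * sigma**2)

    # Hypothesis H1: unknown amplitude a with a broad uniform prior on [-A, A].
    A = 10.0
    a_grid = np.linspace(-A, A, 4001)
    log_lik_a = (-0.5 * np.sum((y[None, :] - a_grid[:, None])**2, axis=1) / sigma**2
                 - 0.5 * N * np.log(2 * np.pi * sigma**2))

    # Maximized likelihood ignores the prior volume over which a must be spread ...
    log_lik_H1_max = log_lik_a.max()

    # ... whereas marginalization averages the likelihood over the prior, so the broad
    # prior on a acts as an automatic complexity penalty (Ockham's razor).
    prior_a = 1.0 / (2 * A)
    da = a_grid[1] - a_grid[0]
    log_evidence_H1 = (np.log(np.sum(np.exp(log_lik_a - log_lik_a.max())) * da * prior_a)
                       + log_lik_a.max())

    print(f"max log-likelihood ratio (H1 vs H0): {log_lik_H1_max - log_lik_H0:+.2f}")
    print(f"log evidence ratio       (H1 vs H0): {log_evidence_H1 - log_lik_H0:+.2f}")
    # The maximized likelihood never favours H0, whereas the marginal (evidence) ratio
    # typically does when the data were generated without a signal.

The grid-based integral merely stands in for whatever marginalization the model admits; the point is only that the evidence ratio, unlike the maximized-likelihood ratio, can favour the simpler hypothesis when the data do not warrant the extra parameter.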