Fundamental neural structures, operations, and asymptotic performance criteria in decentralized binary hypothesis testing
Fundamental neural network structures for decentralized hypothesis testing are considered. For binary hypothesis testing, the basic neural operations are established, and the Neyman-Pearson criterion is adopted, motivated by information-theoretic arguments. Two fundamental neural structures are then analyzed and compared in terms of asymptotic performance measures. In particular, the asymptotic relative efficiency measure is used to establish the performance characteristics of, and the tradeoffs between, the two structures, for both parametrically and nonparametrically defined hypotheses. In the latter case, robust neural network structures are considered, and their superiority over parametric network structures is argued.
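For context, the information-theoretic argument behind adopting the Neyman-Pearson criterion in this setting is typically Stein's lemma, which characterizes the best achievable error exponent, and the asymptotic relative efficiency (ARE) then compares two structures through their exponents. The following is a minimal LaTeX sketch of these standard textbook definitions; the notation (P_0, P_1, E_1, E_2) is illustrative and not taken from the paper itself.

% Standard asymptotic quantities underlying Neyman-Pearson testing.
% Stein's lemma: with n i.i.d. observations under P_0 or P_1, and the
% false-alarm probability constrained by \alpha_n \le \epsilon, the best
% achievable miss probability \beta_n satisfies
\[
  \lim_{n \to \infty} \frac{1}{n} \log \beta_n \;=\; -D(P_0 \,\|\, P_1),
  \qquad
  D(P_0 \,\|\, P_1) \;=\; \int \log\!\frac{dP_0}{dP_1}\, dP_0 .
\]
% Asymptotic relative efficiency of two structures whose miss
% probabilities decay as \beta_i(n) \approx e^{-n E_i}: the limiting
% ratio of sample sizes needed to reach the same error level \beta,
\[
  \mathrm{ARE}_{1,2}
  \;=\; \lim_{\beta \to 0} \frac{n_2(\beta)}{n_1(\beta)}
  \;=\; \frac{E_1}{E_2},
\]
% so \mathrm{ARE}_{1,2} > 1 indicates that structure 1 needs
% asymptotically fewer observations than structure 2.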