Fundamental neural structures, operations, and asymptotic performance criteria in decentralized binary hypothesis testing

Fundamental neural network structures for decentralized hypothesis testing are considered. For binary hypothesis testing, the basic neural operations are established, and the Neyman-Pearson criterion is adopted on the basis of information-theoretic arguments. Two fundamental neural structures are then considered, analyzed, and compared in terms of asymptotic performance measures. In particular, the asymptotic relative efficiency measure is used to establish the performance characteristics of, and the tradeoffs between, the two structures for both parametrically and nonparametrically defined hypotheses. In the nonparametric case, robust neural network structures are considered, and their superiority over parametric network structures is argued.
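For reference, the two performance notions invoked above have standard textbook formulations; the notation below ($\Lambda$, $\tau$, $\gamma$, $n_i$) is ours and is not drawn from the paper itself. The Neyman-Pearson criterion maximizes the detection probability $P_D$ subject to a false-alarm constraint $P_F \le \alpha$, and is attained by a (possibly randomized) likelihood-ratio threshold test,

\[
  \delta(x) \;=\;
  \begin{cases}
    1,      & \Lambda(x) \triangleq \dfrac{p_1(x)}{p_0(x)} > \tau, \\[4pt]
    \gamma, & \Lambda(x) = \tau, \\[2pt]
    0,      & \Lambda(x) < \tau,
  \end{cases}
\]

with $\tau$ and $\gamma$ chosen so that $P_F = \alpha$. The asymptotic relative efficiency of detector 1 with respect to detector 2 is the limiting ratio of the sample sizes $n_2$, $n_1$ each detector needs to meet the same $(P_F, P_D)$ requirement,

\[
  \mathrm{ARE}_{1,2} \;=\; \lim \frac{n_2(P_F, P_D)}{n_1(P_F, P_D)},
\]

with the limit taken as the detection requirement tightens (e.g., $P_D \to 1$ at fixed $P_F$).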