Exponential error bounds for random codes on Gaussian arbitrarily varying channels

The main objective is to develop exponential bounds on the best error probability achievable with random coding on the Gaussian arbitrarily varying channel (GAVC) in the one case where a (strong) capacity exists, namely with peak time-averaged power constraints on both the transmitter and the interference. The GAVC models a channel corrupted by thermal noise and by an unknown interfering signal of bounded power. Upper and lower bounds on the best error probability achievable with random coding on this channel are presented. The asymptotic exponents of these bounds agree over a range of rates near capacity. These exponents are universally larger than the corresponding exponents for the discrete-time Gaussian channel with the same capacity. It is further shown that the decoder can be taken to be the minimum Euclidean distance rule at all rates less than capacity.
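
For reference, and assuming the standard formulation of the GAVC (the symbols $P$, $\Lambda$, and $\sigma^2$ are not defined in the abstract and are introduced here only for illustration), the channel model and the decoding rule referred to above can be sketched as

\[
Y_i = x_i + s_i + Z_i, \qquad \frac{1}{n}\sum_{i=1}^{n} x_i^2 \le P, \qquad \frac{1}{n}\sum_{i=1}^{n} s_i^2 \le \Lambda, \qquad Z_i \sim \mathcal{N}(0,\sigma^2),
\]

where $x$ is the transmitted codeword, $s$ the unknown interference, and $Z$ the thermal noise; under these constraints the random-coding capacity is commonly given as $C = \tfrac{1}{2}\log\!\bigl(1 + P/(\Lambda + \sigma^2)\bigr)$, and the minimum Euclidean distance decoder mentioned above selects $\hat{m} = \arg\min_{m} \lVert y - x_m \rVert^2$ over the codebook.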