Note on Mutual Information and Orthogonal Space-Time Codes

Bit-error probability and mutual information rate have both been used as performance criteria for space-time codes in wireless communication. We use mutual information as the performance criterion because it determines the achievable rate of communication when an outer code is used. In this context, linear dispersion codes, first proposed by Hassibi and Hochwald, are appealing because of the high mutual information they provide, as well as their simplicity. Because complexity increases with the number of symbols, it may be sensible in some settings to fix the number of symbols sent per data bit. In the dissertation of Y. Jiang, it was conjectured that among linear dispersion codes with independent, binary symbols, orthogonal space-time codes are optimal in the following sense: they maximize mutual information subject to an average power constraint on each symbol. We prove the conjecture, in fact for a fixed number of real symbols with arbitrary distributions.
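For concreteness, the following sketch uses the standard notation of Hassibi and Hochwald, which is not drawn from the note itself. A linear dispersion code transmits, over $T$ channel uses from $M$ antennas, a matrix of the form

\[
S \;=\; \sum_{q=1}^{Q} \left( \alpha_q A_q + i\,\beta_q B_q \right),
\]

where $s_q = \alpha_q + i\beta_q$ are the data symbols and $A_q, B_q$ are fixed $T \times M$ dispersion matrices. The Alamouti code is the classic orthogonal example, with $Q = 2$ and

\[
S \;=\; \begin{pmatrix} s_1 & s_2 \\ -s_2^{*} & s_1^{*} \end{pmatrix},
\qquad
S^{*}S \;=\; \left( |s_1|^2 + |s_2|^2 \right) I_2,
\]

the defining orthogonality property that makes the symbols decouple at the receiver.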