Essential Coding Theory Problem Set 2

Problems

1. Prove the noiseless coding theorem and its converse. (But don't turn in.)

2. Consider a Markovian source of bits, where the source consists of a 6-cycle with three successive vertices outputting 0 and three successive vertices outputting 1, and where the probability of going left (or right) from any vertex is exactly 1/2. Compute the rate of this source. (I expect an ab initio argument. Hopefully this will motivate you to look up Shannon's general method for computing the rate of a Markovian source. A simulation sketch that estimates this rate empirically appears after the problem list.)

3. Consider a binary channel whose input/output alphabet is {0, 1}, where a 0 is transmitted faithfully as a 0 (with probability 1), but a 1 is transmitted as a 0 with probability 1/2 and as a 1 with probability 1/2. Compute the capacity of this channel. (You should prove this from scratch using only simple probabilistic facts already stated/used in class, not by referring to tools gleaned from other courses in information theory. For partial credit, you may just prove a lower bound on the capacity. The higher your bound, the more the credit. A numerical sanity check appears after the problem list.)

4. If there is a constructive solution to Shannon's noisy coding theorem with E being a linear map, then show that there is a constructive solution to Shannon's noiseless coding theorem in the case where the source produces a sequence of independent bits of bias p. (A toy illustration of the intended connection appears after the problem list.) Clarifications:
   (a) The encoding and decoding functions used in the noiseless theorem should be polynomial-time computable if the corresponding functions are polynomial-time computable in the noisy theorem.
   (b) The compression rate in the noiseless coding theorem should be arbitrarily close to H(p), assuming the rate of the encoding function in the noisy coding theorem can be made arbitrarily close to 1 − H(p).
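
For Problem 2, the following minimal Monte Carlo sketch (Python; the function names and parameters are my own, not part of the assignment) simulates the 6-cycle source and estimates its rate via empirical block entropies, using the fact that for a stationary source the differences H_k − H_(k−1) of k-gram entropies converge to the entropy rate. It suggests a numerical target for your ab initio argument; it is not a proof.

```python
import random
from collections import Counter
from math import log2

def sample_output(n, seed=0):
    """Simulate the 6-cycle walk and return its binary output string."""
    rng = random.Random(seed)
    state = rng.randrange(6)  # stationary distribution is uniform on the cycle
    bits = []
    for _ in range(n):
        bits.append('0' if state < 3 else '1')     # vertices 0,1,2 emit 0; vertices 3,4,5 emit 1
        state = (state + rng.choice((-1, 1))) % 6  # step left or right, each with probability 1/2
    return ''.join(bits)

def block_entropy(s, k):
    """Empirical entropy (in bits) of the k-grams of s."""
    counts = Counter(s[i:i+k] for i in range(len(s) - k + 1))
    total = sum(counts.values())
    return -sum(c/total * log2(c/total) for c in counts.values())

s = sample_output(1_000_000)
prev = 0.0
for k in range(1, 9):
    hk = block_entropy(s, k)
    # H_k - H_(k-1) decreases toward the entropy rate of the source
    print(f"k={k}  H_k - H_(k-1) = {hk - prev:.4f}")
    prev = hk
```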
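
For Problem 3, a numerical sanity check (again Python, my own helper names): with P(X=1) = q, only an input 1 can produce an output 1, so P(Y=1) = q/2, H(Y|X=0) = 0, and H(Y|X=1) = h(1/2) = 1; hence I(X;Y) = H(Y) − H(Y|X) = h(q/2) − q. A grid search over q then shows what value your from-scratch proof should aim for. This is a sanity check, not a solution.

```python
from math import log2

def h(x):
    """Binary entropy function, in bits."""
    return 0.0 if x in (0.0, 1.0) else -x*log2(x) - (1-x)*log2(1-x)

def mutual_info(q):
    # I(X;Y) = H(Y) - H(Y|X) = h(q/2) - q, since P(Y=1) = q/2
    # and H(Y|X=1) = h(1/2) = 1 while H(Y|X=0) = 0.
    return h(q/2) - q

best = max((mutual_info(i/10000), i/10000) for i in range(10001))
print(f"max I(X;Y) ~= {best[0]:.4f} bits, at P(X=1) ~= {best[1]:.3f}")
```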
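
For Problem 4, here is a toy sketch (Python with numpy; the matrix H and helper names are hypothetical choices of mine) of one standard instance of the kind of connection the problem points at: a parity-check matrix of a linear code compresses a low-weight (i.e., biased) source string to its syndrome, and the code's decoder inverts this map on every string the code could correct as an error pattern. Below, the [7,4] Hamming code compresses 7 bits to 3 whenever the string has weight at most 1. The actual reduction asked for in the problem needs a code family achieving the capacity of BSC(p); this only illustrates the mechanism.

```python
import numpy as np

# Parity-check matrix of the [7,4] Hamming code: column i is the binary
# representation of i+1, so the syndrome of a single 1 in position i
# spells out that position directly.
H = np.array([[(j + 1) >> b & 1 for j in range(7)] for b in (2, 1, 0)])

def compress(z):
    """Map a length-7 source string to its 3-bit syndrome H z (mod 2)."""
    return H @ z % 2

def decompress(s):
    """Recover the lowest-weight z with syndrome s (unique for weight <= 1)."""
    z = np.zeros(7, dtype=int)
    pos = s[0] * 4 + s[1] * 2 + s[2]  # read the syndrome as a position
    if pos:
        z[pos - 1] = 1
    return z

z = np.array([0, 0, 0, 0, 1, 0, 0])   # a "typical" low-weight source string
s = compress(z)
assert (decompress(s) == z).all()
print(f"7 source bits -> 3-bit syndrome {s} -> recovered {decompress(s)}")
```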