paper a day : exponential error bounds for AVCs

Exponential Error Bounds for Random Codes in the Arbitrarily Varying Channel
Thomas Ericson
IEEE Transactions on Information Theory, 31(1) : 42–48

I really wish I had read this paper fully before writing my ISIT submission, because the vocabulary and terminology are less confusing here. The title is pretty self-explanatory, as long as you know information theory. An arbitrarily varying channel (AVC) is a probabilistic/mathematical model for a communication scenario in which there is a sender, a receiver, and a jammer. The jammer wants to minimize the communication rate between the sender and receiver. Assuming there exists some nonzero communication rate that is achievable over all possible strategies of the jammer, Ericson wants to find how the probability of making a decoding error decays as a function of the blocklength of the code. In a heuristic sense, if we make the packets of data larger and larger, how does the error probability scale?

AVCs have a lot of technical junk around them, but one important thing is the distinction between random and deterministic codes. A deterministic code C is a fixed mapping from messages {1, 2, …, N} into codewords {x1, x2, …, xN}. The jammer knows this mapping and so can tailor its strategy to be as bad as possible for this codebook. A random code is a random variable C taking values in the set of deterministic codes. If the number of values that C can take on is exp(n Rz), Ericson calls Rz the key rate of the code. That is, for a blocklength of n, the sender and receiver securely share nRz bits in order to choose a codebook without the jammer's knowledge.
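The shared-key mechanism is easy to sketch in code. Here is a toy illustration (not from the paper; the random-binary codebook and the specific parameters are my own choices): the key indexes one of exp(n Rz) deterministic codebooks, and because sender and receiver derive the codebook from the same key, they agree on it without the jammer ever seeing it.

```python
import random

def make_codebook(seed, num_messages, blocklength):
    """Deterministic codebook: message index -> binary codeword.
    The seed plays the role of the shared secret key."""
    rng = random.Random(seed)
    return [tuple(rng.randint(0, 1) for _ in range(blocklength))
            for _ in range(num_messages)]

n = 8                             # blocklength (toy value)
R_z = 0.5                         # key rate in bits per channel use (toy value)
num_keys = 2 ** round(n * R_z)    # exp(n Rz) possible codebooks
num_messages = 4

# Sender and receiver share the key; the jammer does not.
shared_key = random.randrange(num_keys)
codebook = make_codebook(shared_key, num_messages, n)

# Both ends reconstruct the identical codebook from the key alone.
assert codebook == make_codebook(shared_key, num_messages, n)
```

The point is only that the randomness the jammer cannot predict is exactly the nRz key bits; everything else about the code is public.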

Ericson wants to find the largest attainable E such that the probability of decoding error is upper bounded by exp(-n (E – d)) for some small d > 0. This E, as a function of the rates, is called the reliability function of the channel. Clearly it will depend on the code rate Rx = n⁻¹ log N and the key rate Rz. Let F(Rx) be the reliability function when Rz is infinite. The first result is that E(Rx, Rz) >= min[ F(Rx), Rz ].
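To see what the bound says numerically, here is a tiny sketch with a made-up F (the linear F_toy below is purely illustrative, not the channel's actual reliability function): when the key rate is small it is the bottleneck, and when it is generous the channel exponent takes over.

```python
def E_lower_bound(F, R_x, R_z):
    """Ericson's first result: E(Rx, Rz) >= min(F(Rx), Rz)."""
    return min(F(R_x), R_z)

# Hypothetical infinite-key reliability function, for illustration only:
# decreasing in the rate, hitting zero at a "capacity" of 1.0.
F_toy = lambda R: max(0.0, 1.0 - R)

# Starved for key bits, the key rate limits the exponent...
print(E_lower_bound(F_toy, 0.25, 0.5))   # 0.5
# ...with plenty of key, the channel exponent F(Rx) dominates.
print(E_lower_bound(F_toy, 0.25, 2.0))   # 0.75
```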

In some cases the AVC capacity for deterministic codes is zero. Intuitively, this happens when the jammer can “pretend” to be the sender, so that the receiver can’t tell which of the two is transmitting the real message. Channels with this property are called symmetrizable. If the channel is symmetrizable, then in fact E(Rx, Rz) = min[ F(Rx), Rz ]. This basically says that the best decay in the error probability is limited by the key rate.
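For concreteness, the usual way to write the symmetrizability condition (in the standard notation where W(y|x,s) is the channel and s is the jammer's state; this formulation follows the later AVC literature rather than being copied from Ericson's paper) is that there exists an auxiliary channel U from inputs to states such that

```latex
\sum_{s} W(y \mid x, s)\, U(s \mid x')
  \;=\;
\sum_{s} W(y \mid x', s)\, U(s \mid x)
\qquad \text{for all } x, x', y.
```

In words: by feeding a fake codeword x' through U, the jammer can make the true input x and the fake x' statistically indistinguishable at the receiver, which is exactly the "pretend to be the sender" attack.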

Interestingly enough, this is a similar result to the one that I proved in my ISIT paper, only in my case I did it for the Gaussian channel and I only have an upper bound on the error decay. Strict equality is a bit of a bear to prove.

Anyway, this paper is highly relevant to my current research and probably of interest to about 15 people in the world.
