George makes the following argument: let X be a binary random variable that equals 1 if Iran has an active nuclear weapons program and 0 if not. Suppose that last month we knew that P(X = 1) = p > 1/2. Then we can measure our uncertainty about X via its entropy:
H(X) = hb(p) = -p log p - (1-p) log(1-p)
Here hb(p) is the binary entropy function.
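For readers who want to play with the numbers, here is a minimal sketch of hb in Python. It uses base-2 logarithms (entropy in bits), though nothing in George's argument depends on the base:

```python
import math

def hb(p: float) -> float:
    """Binary entropy of a coin with bias p, in bits."""
    if p in (0.0, 1.0):
        return 0.0  # by the usual convention 0 log 0 = 0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# hb peaks at p = 1/2 (one full bit of uncertainty) and is
# symmetric about it: hb(p) == hb(1 - p).
for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"hb({p}) = {hb(p):.4f}")
```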
Now let Y be a random variable representing the NIE (the National Intelligence Estimate). We know that conditioning reduces entropy:

H(X | Y) ≤ H(X)
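To see concretely what this inequality says, here is a small numerical check, where H(X | Y) is the average Σ_y P(Y = y) H(X | Y = y). The prior and the likelihoods P(Y = y | X = x) below are made-up illustration numbers, not anything drawn from the actual NIE:

```python
import math

def hb(p: float) -> float:
    """Binary entropy in bits (same hb as in the sketch above)."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Hypothetical numbers, purely for illustration.
p = 0.7                   # prior P(X = 1)
like = {1: 0.2, 0: 0.8}   # like[x] = P(Y = 1 | X = x)

# Marginal P(Y = 1), then the posteriors P(X = 1 | Y = y) via Bayes' rule.
p_y1 = like[1] * p + like[0] * (1 - p)
post = {
    1: like[1] * p / p_y1,              # P(X = 1 | Y = 1)
    0: (1 - like[1]) * p / (1 - p_y1),  # P(X = 1 | Y = 0)
}

H_X = hb(p)
# "Conditioning reduces entropy" is a statement about the average over Y:
H_X_given_Y = p_y1 * hb(post[1]) + (1 - p_y1) * hb(post[0])

print(f"H(X)     = {H_X:.4f} bits")
print(f"H(X | Y) = {H_X_given_Y:.4f} bits  (<= H(X), as the inequality promises)")
```

For these particular numbers it prints H(X) ≈ 0.88 bits and H(X | Y) ≈ 0.65 bits, consistent with the inequality.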
Let p' be our new probability that X = 1 conditioned on the evidence Y. We cannot have p' < p, because then hb(p') > hb(p), which is a contradiction. Therefore p' ≥ p, and so the NIE shows that the chance Iran has an active nuclear weapons program is even higher than before.
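As an aid for the exercise below, here is a sketch that tabulates hb(p') against hb(p) for an arbitrary prior (p = 0.7 is just an illustrative choice), so the reader can check for which values of p' the comparison hb(p') > hb(p) that George invokes actually holds:

```python
import math

def hb(p: float) -> float:
    """Binary entropy in bits (same hb as in the earlier sketch)."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.7  # an arbitrary prior > 1/2, for illustration only
print(f"hb({p}) = {hb(p):.4f}")
for pp in (0.05, 0.1, 0.5, 0.6, 0.8, 0.9, 0.95):
    relation = "<=" if hb(pp) <= hb(p) else "> "
    print(f"p' = {pp:4}: hb(p') = {hb(pp):.4f}  {relation} hb(p)")
```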
Exercise: Explain the error(s) in George’s argument.
Extra Credit: Write a short essay explaining why one should not abuse information theory for political ends.