characterizations of entropy

I was looking up axiomatic characterizations of entropy today and figured I’d share. There are two axiomatizations I found while rooting around in my books. I’m sure there are more, but these two come with nicely plausible justifications. The entropy of a random variable measures the uncertainty inherent in that random variable. Earlier I argued that one should think of this uncertainty in terms of praxis, namely how many bits it takes to resolve something, or how many bits it can resolve. Here the question is more fundamental: what do we mean by the uncertainty?
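
For concreteness, the quantity being characterized is the familiar Shannon entropy. As a reminder (standard notation, nothing specific to either axiomatization): for a discrete random variable $X$ with distribution $p$,

$$H(X) = -\sum_x p(x)\,\log_2 p(x),$$

so a fair coin has entropy $1$ bit and a uniform choice among eight outcomes has entropy $3$ bits, which matches the "bits it takes to resolve" picture.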