MAP and ML in practice on the New Jersey Turnpike

Since I might be teaching detection and estimation next semester, I’ve been thinking a little bit about decision rules during my commute down the New Jersey Turnpike. The following question came to mind:

Suppose you see a car on the Turnpike that is clearly being driven dangerously (weaving between cars, going 90+ MPH, tailgating an ambulance, and the like). You have to decide: does the car have New Jersey or New York plates [*]?

This is a hypothesis testing problem. I will assume for simplicity that New York drivers have cars with New York plates and New Jersey drivers have New Jersey plates [**]:
H_0: New Jersey driver
H_1: New York driver
Let Y be a binary variable indicating whether or not I observe dangerous driving behavior. Based on my entirely subjective experience, I would say that, in terms of likelihoods,
\mathbb{P}(Y = 1 | H_1) > \mathbb{P}(Y = 1 | H_0)
so the maximum likelihood (ML) rule would suggest that the driver is from New York.

However, if I take into account my (also entirely subjective) priors \mathbb{P}(H_0), \mathbb{P}(H_1) on the fractions of drivers from New Jersey and New York, respectively, I would have to say
\mathbb{P}(Y = 1 | H_1) \mathbb{P}(H_1) < \mathbb{P}(Y = 1 | H_0) \mathbb{P}(H_0)
so the maximum a-posteriori probability (MAP) rule would suggest that the driver is from New Jersey.
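To make the comparison concrete, here is a minimal Python sketch. All of the numbers are made up; they stand in for my entirely subjective likelihoods and priors, and with these hypothetical values the ML rule picks New York while the MAP rule picks New Jersey.

```python
# A minimal sketch of the two decision rules. All numbers are
# hypothetical stand-ins for my entirely subjective beliefs.

# Likelihood of observing dangerous driving (Y = 1) under each hypothesis.
p_y1_given_h0 = 0.05   # P(Y = 1 | H_0), H_0 = New Jersey driver (made up)
p_y1_given_h1 = 0.20   # P(Y = 1 | H_1), H_1 = New York driver (made up)

# Priors on where the driver is from (made up: most of the traffic
# around me on the Turnpike is from New Jersey).
p_h0 = 0.85            # P(H_0)
p_h1 = 0.15            # P(H_1)

# ML rule: pick the hypothesis with the larger likelihood of Y = 1.
ml = "H_1 (New York)" if p_y1_given_h1 > p_y1_given_h0 else "H_0 (New Jersey)"

# MAP rule: weight each likelihood by its prior, pick the larger product.
map_ = ("H_1 (New York)" if p_y1_given_h1 * p_h1 > p_y1_given_h0 * p_h0
        else "H_0 (New Jersey)")

print("ML decision: ", ml)    # ML decision:  H_1 (New York)
print("MAP decision:", map_)  # MAP decision: H_0 (New Jersey)
```

The flip happens exactly when the prior ratio \mathbb{P}(H_0)/\mathbb{P}(H_1) exceeds the likelihood ratio \mathbb{P}(Y = 1 | H_1)/\mathbb{P}(Y = 1 | H_0): here 0.85/0.15 \approx 5.7 beats 0.20/0.05 = 4.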

Which is better?

[*] I am assuming North Jersey here, so Pennsylvania plates are negligible.
[**] This may be a questionable modeling assumption given suburban demographics.

A joke for Max Raginsky

Setting: a lone house stands on a Scottish moor. The fog is dense here. It is difficult to estimate where your foot will fall. A figure in a cloak stands in front of the door.

Figure: [rapping on the door, in a Highland accent] Knock knock!

Voice from inside: Who’s there?

Figure: Glivenko!

Voice: Glivenko who?

Figure: Glivenko-Cantelli!

[The fog along the moor converges uniformly on the house, enveloping it completely in a cumulus.]

Scene.
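For anyone left standing in the fog: the Glivenko-Cantelli theorem says that the empirical CDF of an i.i.d. sample converges uniformly, almost surely, to the true CDF. Here is a minimal numpy sketch of the sup-norm gap shrinking, using Uniform(0,1) samples so that the true CDF is simply F(x) = x:

```python
import numpy as np

# Glivenko-Cantelli: the empirical CDF F_n of an i.i.d. sample
# converges uniformly (almost surely) to the true CDF F.
rng = np.random.default_rng(0)

def sup_gap(n):
    """sup_x |F_n(x) - x| (the Kolmogorov-Smirnov statistic) for n samples."""
    x = np.sort(rng.uniform(size=n))
    i = np.arange(1, n + 1)
    # F_n jumps from (i-1)/n to i/n at each sorted sample point x_(i),
    # so the supremum is attained at one of the jumps.
    return np.max(np.maximum(i / n - x, x - (i - 1) / n))

for n in [10, 100, 1000, 10000]:
    print(n, sup_gap(n))   # the gap shrinks, roughly like 1/sqrt(n)
```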

A teaser for ITAVision 2015

As part of ITAVision 2015 we are soliciting individuals and groups to submit videos documenting their love of information theory and/or its applications. During ISIT we put together a little example with our volunteers (it sounded better in rehearsal than at the banquet, alas). The song was "Entropy is Awesome," based on "Everything Is Awesome," obviously. There is also a karaoke version if you want to sing along.

The lyrics (so far) are:

Entropy is awesome!
Entropy is sum minus p log p
Entropy is awesome!
When you work on I.T.

Blockwise error vanishes as n gets bigger
Maximize I X Y
Polarize forever
Let’s party forever

I.I.D.
I get you, you get me
Communicating at capacity

Entropy is awesome…
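(For the uninitiated: "sum minus p log p" is the Shannon entropy H(X) = -\sum_x p(x) \log p(x), and "Maximize I X Y" refers to the channel capacity C = \max_{p(x)} I(X; Y), which is what communicating at capacity is all about.)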

This iteration of the lyrics is due to a number of contributors; it was truly a group effort. If you want to help flesh out the rest of the song, please feel free to email me and we'll keep the collaboration going.

More details on the contest will be forthcoming!

How many people have "met Shannon"?

I saw a paper on arXiv yesterday called Kalman meets Shannon, which got me thinking: in how many papers has someone met Shannon, anyway? Krish blogged about this a few years ago, but since then Shannon has managed to meet some more people. I plugged "meets Shannon" into Google Scholar, and out popped a whole list of titles.

Sometimes people are meeting Shannon, and sometimes he is meeting them, but each meeting produces at least one paper.