In The Family

On Saturday evening I saw In The Family at the Asian American Showcase. It’s a film by Patrick Wang, whom I may have last seen in a production of Grand Hotel at MIT when I was just starting college. It’s a film that is definitely worth seeing — an affecting and truthful story, it may make you tear up at times. It will also make you believe that a deposition can be the most important moment in a person’s life.

The synopsis says:

In the town of Martin, Tennessee, Chip Hines, a precocious six year old, has only known life with his two dads, Cody and Joey. And a good life it is. When Cody dies suddenly in a car accident, Joey and Chip struggle to find their footing again. Just as they begin to, Cody’s will reveals that he named his sister as Chip’s guardian. The years of Joey’s acceptance into the family unravel as Chip is taken away from him. In his now solitary home life, Joey searches for a solution. The law is not on his side, but friends are. Armed with their comfort and inspired by memories of Cody, Joey finds a path to peace with the family and closer to his son.

The trailer starts almost at the end of the film, and I think it doesn’t really show the things which are the most beautiful about it. There is a scene after Cody’s funeral when Joey and Chip return to the house, shocked. Joey sits at the kitchen table, and Chip (where do they get these child actors — the kid is amazing!) has a long silent scene in which he gets the mail, climbs on the step stool, gets a glass, gets the Coke from the fridge, pours himself some, gets his dad a beer, opens the beer with some effort, then clinks the bottle and glass for cheers, and that is what snaps Joey out of it; he starts sorting the mail. This is what I mean by a truthful scene — in the face of trauma and loss, at some point we go on, as Beckett might say. Watching those moments is important.

So the film is almost 3 hours long. But it’s worth it, because it shows you that kind of truth, moment by moment. You get to understand what is at stake in this story, why Cody and Chip mean so much to Joey. It’s a beautiful debut film; it was rejected from a number of festivals, but the filmmakers are self-distributing it, and it will hopefully appear soon in a venue near you. Do try to see it — it will move you.

Cover’s test for the irrationality of a coin

Someone hands you a coin which has a probability p of coming up heads. You can flip the coin as many times as you like (or more precisely, you can flip the coin an infinite number of times). Let S = \{r_i : i = 1, 2, \ldots\} be the set of rational numbers in [0,1]. After each flip, you have to guess one of the following hypotheses: that p = r_i for a particular i, or p is irrational. Furthermore, you can only make a finite number of errors for any p \in [0,1] - N_0, where N_0 is a set of irrationals of Lebesgue measure 0. Can you do it? If so, how?

This is the topic addressed by a pair of papers that Avrim Blum mentioned in Yoav Freund’s remembrances of Tom Cover:

COVER, T. M. (1973). On determining the irrationality of the mean of a random variable. Ann. Statist. 1, 862-871.
COVER, T. M. & HIRSCHLER (1975). A finite memory test of the irrationality of the parameter of a coin. Ann. Statist. 3, 939-946.

I’ll talk about the first paper in this post.

The algorithm is not too complicated — you basically go in stages. For each time j = 1, 2, \ldots you have a function n(j). Think of n(j) as piecewise constant. There are two sequences: a threshold k_{n(j)}, and an interval width \delta_{n(j)}.

  1. Take the sample mean \hat{x}_{n(j)} and look at an interval of width 2 \delta_{n(j)} centered on it. Note that this makes the same decision for each j until n(j) changes.
  2. Given an enumeration of the set S, find the smallest i such that r_i \in [\hat{x}_{n(j)} - \delta_{n(j)}, \hat{x}_{n(j)} + \delta_{n(j)}].
  3. If such an i exists and i < k_{n(j)}, declare p = r_i; otherwise declare p \notin S.
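As a very much toy illustration, here is a single stage of this procedure in Python. The enumeration order of S, the threshold k, and the width delta below are my own ad hoc choices for the example, not the schedules k_{n(j)} and \delta_{n(j)} constructed in the paper.

```python
import random
from fractions import Fraction
from itertools import islice
from math import gcd, log, sqrt

def rationals_01():
    """Enumerate S, the rationals in [0, 1]: 0, 1, 1/2, 1/3, 2/3, 1/4, ..."""
    yield Fraction(0)
    yield Fraction(1)
    q = 2
    while True:
        for num in range(1, q):
            if gcd(num, q) == 1:
                yield Fraction(num, q)
        q += 1

def cover_decision(flips, k, delta):
    """One stage of the test: among the first k rationals, return the
    first r_i within delta of the sample mean (declaring p = r_i);
    return None to declare p irrational."""
    xbar = sum(flips) / len(flips)
    for r in islice(rationals_01(), k):
        if abs(xbar - r) <= delta:
            return r
    return None

# Toy run with p = 1/3 (illustrative parameters, not the paper's).
random.seed(0)
n = 100_000
flips = [1 if random.random() < 1 / 3 else 0 for _ in range(n)]
delta = 2 * sqrt(log(log(n)) / n)  # ad hoc LIL-flavored interval width
guess = cover_decision(flips, k=200, delta=delta)
print(guess)
```

With these settings the sample mean sits close enough to 1/3, and no earlier rational in the enumeration falls inside the interval, so the stage declares p = 1/3. Shrinking k (so that 1/3 is not yet enumerated) makes the same stage declare p irrational, which is exactly the "hypothesis class grows with the data" behavior discussed below.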

The last thing to do is pick all of these scalings. This is done in the paper (I won’t put it here), but the key thing to use is the law of the iterated logarithm (LIL), which I never really had a proper appreciation for prior to this. For \epsilon > 0,

| \hat{x}_n - p | \le (1 + \epsilon) \sqrt{ \frac{2 p (1 - p) \log \log n}{n} }

for all but finitely many values of n. This gets used to set the interval width \delta_{n(j)}.
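To get a feel for how slowly this envelope shrinks, here is a quick numerical check (the choice \epsilon = 0.1 and the sample values of n are mine):

```python
from math import log, sqrt

def lil_envelope(n, p, eps=0.1):
    """(1 + eps) * sqrt(2 p (1 - p) log log n / n): the LIL bound that
    |xbar_n - p| eventually stays under, for all but finitely many n."""
    return (1 + eps) * sqrt(2 * p * (1 - p) * log(log(n)) / n)

for n in (10**3, 10**5, 10**7):
    print(n, lil_envelope(n, p=0.5))
```

The extra log log n factor means the envelope is only slightly wider than the usual 1/\sqrt{n} rate, which is what makes it usable as the interval width \delta_{n(j)}.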

The cool thing to me about this paper is that it’s an example of “letting the hypothesis class grow with the data.” We’re trying to guess if the coin parameter p is rational and if so, which rational. But we can only apprehend a set of hypotheses commensurate with the data we have, so the threshold k_{n(j)} limits the “complexity” of the hypotheses we are willing to consider at time j. The LIL sets the threshold for us so that we don’t make too many errors.

There are lots of little extensions and discussions about the rationality of physical constants, testing for rationality by revealing digits one by one, and other fun ideas. It’s worth a skim for some of the readers of this blog, I’m sure. A miscellaneous last point: Blackwell suggested a Bayesian method for doing this (mentioned in the paper) using martingale arguments.