# SPCOM 2014: tutorials

I just attended SPCOM 2014 at the Indian Institute of Science in Bangalore — many thanks to the organizers for the invitation! SPCOM 2014 happens every two years and is a mix of invited and submitted papers (much like Allerton). This year they mixed the invited talks in with the regular talks, which I thought was a great idea: since the invited papers were not tied to specific sessions, it makes more sense to do it that way, and it avoids a sort of “two-tier” system.

I arrived early enough to catch the tutorials on the first day. There was a 3-hour session in the morning and another in the afternoon. For the morning I decided to expand my horizons by attending Manoj Gopalkrishnan’s tutorial on the physics of computation. Manoj focused on the question of how much energy it takes to erase or copy a bit of information. He started with some historical context via von Neumann, Szilard, and Landauer to build a correspondence between familiar information-theoretic concepts and their physical counterparts. In this correspondence, relative entropy is the same as free energy. He then turned to what one might call “finite-time” thermodynamics. Suppose that you have to apply a control that operates in finite time in order to change a bit. One way to look at this is through controlling the transition probabilities in a two-state Markov chain representing the value of the bit you want to fix. You want to drive the resting state (with stationary distribution $(1/2,1/2)$) to something like $(\epsilon, 1 - \epsilon)$ within time $T$. At this level I more or less understood what was going on, but since my physics background is pretty poor, I think I missed out on how the physical intuition/constraints affect which control strategies you can choose.
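To make the Markov-chain picture concrete, here is a minimal sketch (my own illustration, not from the tutorial, and ignoring the physical/energetic constraints entirely): start the chain at the resting distribution $(1/2,1/2)$ and apply a fixed transition matrix whose stationary distribution is $(\epsilon, 1-\epsilon)$ for $T$ steps. The flip probabilities `a` and `b` are hypothetical choices, picked only so that the stationary law comes out right.

```python
import numpy as np

def kl(p, q):
    # relative entropy D(p||q) in nats -- the "free energy" in the correspondence
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

eps = 0.01   # target probability of the "wrong" bit value
T = 50       # number of control steps

# Per-step control: flip 0 -> 1 with prob a and 1 -> 0 with prob b.
# The stationary distribution of this chain is (b/(a+b), a/(a+b)),
# so these (hypothetical) choices give exactly (eps, 1 - eps).
a, b = 0.2 * (1 - eps), 0.2 * eps
P = np.array([[1 - a, a],
              [b, 1 - b]])

pi = np.array([0.5, 0.5])   # resting state of the bit
for _ in range(T):
    pi = pi @ P

print(pi)                        # close to (eps, 1 - eps)
print(kl(pi, [eps, 1 - eps]))    # remaining "distance" from the target
```

With mixing factor $1 - a - b = 0.8$ per step, the gap to the target shrinks geometrically, so after 50 steps the bit is essentially fixed; the physics question is what such a protocol costs in finite time.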

Prasad Santhanam gave the other tutorial, which was more solid ground for me. This was not quite a tutorial on large-alphabet probability estimation, but more directly on universal compression and redundancy calculations. The basic setup is that you have a family of distributions $\mathcal{P}$ and you don’t know which distribution $p \in \mathcal{P}$ will generate your data. Based on the data sample you want to do something: estimate some property of the distribution, compress the sample to a size close to its entropy, etc. A class can be weakly or strongly compressible, or insurable (which means being able to estimate quantiles), and so on. These problems turn out to differ from each other depending on some topological features of the class. One interesting thing to consider for the machine learners out there is the stopping time that you need in some of the analyses. As you go along, observing the data and doing your task (estimation, compression, etc.), can you tell from the data that you are doing well? This has major implications for whether an online algorithm can even work the way we want it to, and is something Prasad calls “data-driven compressible.”
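To give a flavor of these redundancy calculations (a toy example of my own, not from the tutorial): for the class of i.i.d. binary sources, a sequential probability assignment like the classic Laplace add-one estimator pays codelength at most $O(\log n)$ bits above the empirical entropy of the sample, so the per-symbol redundancy vanishes as $n$ grows.

```python
import numpy as np
from math import log2

def sequential_codelength(x, alphabet_size=2):
    # Laplace add-one assignment: codelength = sum_t -log2 P(x_t | x_1..x_{t-1}).
    counts = np.ones(alphabet_size)
    bits = 0.0
    for s in x:
        bits += -log2(counts[s] / counts.sum())
        counts[s] += 1
    return bits

def empirical_entropy_bits(x, alphabet_size=2):
    # n * H(empirical distribution) = the maximum-likelihood codelength.
    counts = np.bincount(x, minlength=alphabet_size)
    p = counts / counts.sum()
    return float(-(p[p > 0] * np.log2(p[p > 0])).sum()) * len(x)

rng = np.random.default_rng(0)
n = 10_000
x = rng.choice(2, size=n, p=[0.8, 0.2])   # unknown source, here Bernoulli(0.2)

L = sequential_codelength(x)
H = empirical_entropy_bits(x)
print((L - H) / n)   # per-symbol redundancy, shrinking like O(log n / n)
```

The add-one scheme is a mixture over the parameter, so its codelength always sits above the maximum-likelihood codelength; the interesting questions in the tutorial are what happens when the alphabet is large or the class $\mathcal{P}$ is less tame than this.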

I’ll try to write another post or two about the talks I saw as well!

I’m sick today so here are some links.

Click That Hood, a game which asks you to identify neighborhoods. I was lousy at San Diego, but pretty decent at Chicago, even though I’ve lived here for half the time. Go figure.

For those who care about beer, there’s been some news about the blocked merger of InBev and Modelo. I recommend Erik’s podcast post on the structure of the beer industry (the three-tier system) for those who care about craft beer, and (with reservations) Planet Money’s show on the antitrust regulatory framework that is at work here.

Remember step functions from your signals and systems course? We called them Heaviside step functions after Oliver Heaviside — you can read more about him in this Physics Today article.

I need this album, since I love me some Kurt Weill. I can also live vicariously through NPR’s list of SXSW recommendations.

Posting has been nonexistent this week due to being busy and incredibly tired. Hopefully the improved spring weather will thaw me out. On the upside, I’ve been reading more.

Speaking of race, the Chronicle of Higher Education published a piece mocking the whole field of Black Studies based on reading the titles of (proposed) dissertations (and a paragraph description). Tressie mc had a trenchant response. The faculty and students also responded.

And segueing from race via race and statistics (and eugenics), most of Galton’s works are now online.

Dirac’s thoughts on math and physics.

A touching film about 9/11 by Eusong Lee of CalArts.

Links to videos and a special chair.

James Baldwin debates William F. Buckley, Jr. I’ve only seen part of it so far, but it’s pretty interesting (via Ta-Nehisi Coates).

I’ve heard quite a bit about the treatment of agricultural workers in Florida, particularly in tomato farming, but this video with a representative of the Coalition of Immokalee Workers is a good introduction to what is going on there (via Serious Eats). The book Tomatoland is on my reading list.

I didn’t know the origin of the term swizzle-stick until now.

I’m a big fan of Cowboy Bebop, and Shinichiro Watanabe has a new show out called Sakamichi no Apollon (via MeFi). I watched the first episode, and the Art Blakey album Moanin’ features prominently, so I think I’m going to like this show quite a bit. It’s being streamed in an ad-heavy format on Crunchyroll.

That’s a lot of pendulums. That’s right, pendulums.

Why don’t you relax a little in the bear chair?