ISIT 2007 : source coding

Source Coding with Distortion through Graph Coloring (Vishal Doshi, Devavrat Shah, and Muriel Médard) : This paper looked at the rate-distortion function for reconstructing a function f(X,Y) with X at the encoder and Y as side information at the decoder. One can think of this as a “functional” Wyner-Ziv, or in some cases as a noisy Wyner-Ziv. In the lossless setting, the reconstruction function can be found using graph entropy, and minimum-entropy graph coloring turns out to be a way of obtaining a solution. For the lossy problem, they find a similar graph coloring scheme but can’t get a single-letter expression in all cases. What is interesting to me is the optimality of “preprocess + Slepian-Wolf,” which is similar in spirit to the work done by Wang, Wagner, Viswanath and others for Gaussian multiterminal source coding problems.

Compound conditional source coding, Slepian-Wolf list decoding, and applications to media coding (Stark C. Draper and Emin Martinian) : The main motivation here was that many multimedia applications (such as video coding) may more fruitfully be thought of as compound source coding problems with unknown side information at the decoder. In this setting, we can imagine the side information as one of P different predictors of the source X available at the encoder. The encoder can use the different predictors to list decode the source message and send conditional list-disambiguation information in addition to the source encoding. It’s a neat scheme that seems quite related to some of my thesis work on list decoding for AVCs.

Correlated Sources over a Noisy Channel: Cooperation in the Wideband Limit (Chulhan Lee and Sriram Vishwanath) : I have to admit I didn’t fully get this talk, but it looked at wideband distributed source coding using “highly correlated sources.” They propose a modified PPM scheme which can exploit the correlation as if the encoding is joint with small bit error rate (but possibly larger block-error rate). What was unclear to me was why the modified error criterion was necessary, but it seems to be an artifact of the proposed scheme. The algorithm requires a sliding window decoder whose analysis seems a bit tricky.

Joint Universal Lossy Coding and Identification of Stationary Mixing Sources (Maxim Raginsky) : What is the loss in estimating the parameter of a source and doing universal lossy source coding? By using a competitive optimality framework and a Lagrangian formulation to trade off the parameter error and source distortion, Raginsky can bound the loss in performance. This falls into the category of the O(log n/sqrt(n)) results that I don’t know much about, but I will probably take a look at the full paper to get a better idea of how the codes work. He uses some ideas of VC dimension from learning theory, which I know a little about, so hopefully it will not be too hard going…

The source coding game with a cheating switcher (Hari Palaiyanur, Cheng Chang, Anant Sahai) : This is an extension of Berger’s source coding game, in which a switcher switches between k memoryless sources and you have to make a rate distortion code that can handle the worst case behavior of the switcher. Before, the switcher could not see the source outputs first. Here, he can (hence “cheating”). The main point is to figure out which iid distributions the switcher can emulate, and the worst one in that set gives the bound. The rest is a union bound over types and doesn’t affect the rate.

How much detail should a review version have?

One pesky problem that seems to pop up when I write or review papers is the “minor algebra error due to space constraints.” You have some theorem and then you go back and redefine x to be 2 y instead of y/2 and then suddenly stuff is off by a factor of 4 everywhere, but that doesn’t matter for the result per se — it’s just some constant floating around. This is of course an issue with conference papers, since they have strict page limits, and you end up shortening proofs to sketches, but it also happens while revising journal papers. One of the jobs of the reviewer is to check that the algebra works out, which becomes tedious if all the algebra is in fact correct but the paper skips 5 lines of simplifications and you have to go and work it out yourself.

Which brings me to the modest proposal : when submitting a journal paper, put in all the algebra, with a little footnote saying that you’ll omit the intermediate steps in the final version. This way, the checking for correctness becomes almost mechanical. Sure, it may make the submitted manuscript look bloated, but then the time saved can be spent on checking the structure of the argument. As an added benefit, the writer will be forced to explore the full ramifications of changing the notation around. Of course, this wouldn’t be possible (probably) for conference papers, but would it help for the journal process?

resizing delimiters in LaTeX with line breaks

IEEE uses a two-column format that is a bit narrow for large formulae, and it makes parenthesis resizing a pain when you have to break lines, because LaTeX (apparently) will not match parenthesis sizes across lines. For example, consider

\mathbb{P}\left( \frac{1}{N} \sum_{i=1}^{N} \mathbf{1}\big( \mathbf{y}(T_c) \in D(Z_{i,c}) \big) > G \right)

So if you have an \exp\left( followed by some tall expression, like - \sum_{i=1}^{n} \frac{1}{2^i} \int_{\mathbb{R}} \langle f_i(t), g(t) \rangle dt + \prod_{i=1}^{n} f_i(0) - \lim_{x \to \infty} \frac{g(t)}{2 \pi}, you start to run into problems fitting the whole thing on the line so that the corresponding \right) fits within the page margin. Furthermore, if the equation has multiple opening brackets and different size elements, the opening and closing brackets may not match in size when you break the line.

My old hack for this was to manually resize the delimiters, replacing the \left( with a fixed-size \Big( or something like that, putting empty \right. commands before the line break, and then starting the next line with empty \left. commands. If you have multiple opening and closing brackets you have to futz around, putting a \Big or \Bigg on each delimiter to make it fit, but a (somewhat) easier hack is to insert a tall whitespace like this:

\rule{0pt}{15pt} \right. \right. \nonumber \\
&
\left. \left. \rule{0pt}{15pt}

This isn’t too great a savings, since I now have to resize 2 things instead of 4, but it’s something at least, and the delimiters end up the same size. I could probably write a macro to do this, but that seems like a waste of time.
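To make the hack concrete, here is a minimal sketch of a two-line split using the strut trick (the formula is the one from above, and the 15pt strut height is just illustrative — you tune it to your tallest term):

```latex
% Requires amsmath (align) and amssymb (\mathbb).
% Each line is balanced on its own: \left( ... \right. on the first,
% \left. ... \right) on the second. The invisible \rule struts force
% the auto-sized delimiters on both lines to (at least) the same
% height, so the parentheses come out matching across the break.
\begin{align}
\mathbb{P}\left( \frac{1}{N} \sum_{i=1}^{N}
    \mathbf{1}\big( \mathbf{y}(T_c) \in D(Z_{i,c}) \big)
    \rule{0pt}{15pt} \right. \nonumber \\
  & \qquad \left. \rule{0pt}{15pt} {} > G \right)
\end{align}
```

If the struts are shorter than the tallest element on either line, the sizes can still disagree, so in practice you bump the height up until both lines agree.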

regular blogging resumes

I apologize for all the server issues I’ve been having (it’s a long story involving caching and file permissions issues) — I think things are back to normal now, and I should get some posts about Paris and ISIT up sometime soonish…

ISIT 2007 : general impressions

I had meant to start blogging ISIT as it was going on, but that turned out to be infeasible due to a number of factors. The most relevant was that I spent the first 3 days of this year’s conference battling a fever, so by the time I got back to the hotel I was in no mood to type up any of the notes I had taken from the talks, and I also had to leave early the first two days so that I could lie down. Another downside was that my attention was not as sharp as it should have been, so I apologize in advance for the facile nature of my observations.

In general, the program this year felt way more Shannon-theory heavy to me than last year’s. Perhaps that’s because I wasn’t hanging out with many coding theorists, so I didn’t hear about the talks, but I felt like the coding talks in which I was interested were few in number. Furthermore, I ended up missing the one or two that I really wanted to attend (such as the new Reed-Solomon list decoding algorithm).

My biggest gripe was the lack of water at the sessions — there were two snack breaks, between the two morning and two afternoon sessions. There was juice, water, coffee and caffeinated tea, and everything was cleared out immediately upon the resumption of the sessions. I understand taking away the coffee and juice, but there was no water to be had in the session rooms and none in the convention area unless you bought it at the overpriced bar. Perhaps that’s in the contract with the convention center, but it’s a lousy deal.

This year they were also quite zealous in checking that everyone had their conference badge to prove that they had registered. The stinginess with respect to the water complemented (in some sense) their concern that freeloaders not be allowed into the sessions. I left my badge at the hotel one day and was made to get a temporary one for that day.

The big news of course was that Bob Gray won the Shannon Award for next year, which made me happy. I heard some people say beforehand that he seemed like an outside shot since he’s not a pure information theorist per se, and that’s what the award is about. In some sense Sergio Verdú is an Information Theorist’s information theorist, and I think one would be hard-pressed to find a candidate like him. You wouldn’t say Kailath is a pure IT guy either, right?

All in all, I had a good time, and had a few interesting research discussions with different people. I find myself wanting to work on some new problems, which is good from the perspective of learning some new areas, but bad from the perspective of trying to wrap things up and write my thesis. I was asked by several people if I was graduating soon and what I was planning on doing next, which is always a terrifying question. Hopefully I’ll get some of that sorted out this summer…

my trip to jolly olde England

I spent part of the week before last in Cambridge, UK, which is a very different town from Cambridge, Mass., although it has some of the same problems with (for lack of a better term) local deformation effects in the street mappings. Another point of similarity is the density of bookstores, although I have to say that I prefer the ones here a bit, because some of them specialize. I was particularly tempted by a music store which had Eisler’s score to Mahagonny and some nice Purcell collections and…

But I digress. Some unfortunate things I missed — punting on the Cam, any and all May Balls or Garden Parties (although if they think I’d shell out 500 quid to go to some poncy event at a college in which some of the boys have the gall to show up in a black suit with a bow tie rather than a proper tuxedo they’d have another thing coming), and a day trip to Oxford that had to be cancelled at the last minute (sorry, Jeff!).

I did manage to meet up with my ex-wife for a pint and a somewhat languid production of Alan Ayckbourn’s Bedroom Farce, which was spot-on in some moments (with some pretty effective physical comedy) and churned in place for others (long pauses for laughs that did not come). I should read the play to see how much Ayckbourn writes in himself and how much he leaves to the director. Some of the characters are quite brilliant. Perhaps it’s because I now know some married people that I can see the types a bit better — the last time I saw a play of his was in high school, I think. In particular, Trevor is a real piece of work. “I’m a destroyer,” he says, trying to puzzle out why his relationships go sour.

I’ve been staying here with my friend Tony, who I hung out with last year in Seattle before ISIT 2006. Perhaps next year he will be close to Toronto for ISIT 2008, but I somewhat doubt it. We get to have fun arguments about Bayesian statistics and other light topics. On Friday we managed to make it into London, where we got 5-quid tickets to see The Merchant of Venice. It was a period production, and I think part of the appeal of going to the Globe is to get a historicized experience of “what it was like back then.” The ushers are pretty ridiculous — there was plenty of room to sit in the “groundlings area” but they would not let you sit down at all. Because it was trying to be more “authentic,” the production did little to tone down the Jew-hating in the script or contextualize Shylock’s position, which I found quite problematic. The audience brings its own cultural context to a production, and to present a play as a cultural artifact outside of its original cultural context is misleading, I think. The audience should be alienated from its own cultural Gestalt in order to get a critical perspective on a “historical performance.” But maybe that’s just me blathering on a bit.

After the play we went to the Tate Modern (just next door!) and saw the Global Cities exhibition, which was pretty awesome. They took a number of major global cities and compared them, asking tough questions about whether better urban planning can really solve our problems. We also got to see an exhibit on Surrealism and its influence, which was pretty cool. I saw a few minimalist pieces that I remembered from the exhibit at the MOCA in LA. We didn’t pay to see the Heitor Oiticica exhibit, but they had a few of his other pieces outside (perhaps to entice you to pay?) which I thought were pretty cool in the “uses motors and so on in a fun way” vein.

All in all I enjoyed my time in England, even if they don’t know what to do with vegetables and a “salad” to them is “grated carrots, sauteed mushrooms, two slices of tomato, and pickled cabbage.” Perhaps the next time I come back I’ll see more of London, but Scotland also seems quite appealing…

warning : ISIT 2007 blogging to begin soon

I was completely ill for the first day of ISIT so I only made it to a handful of talks, but I have notes and will be posting about the conference soon, hopefully. Let this be a warning to those who read this blog and are bored by information theory…

UPDATE : still ill, which did not help my talk today. Hopefully I’ll be better tomorrow morning for talk number 2…