Today there were a lot fewer talks that I was really interested in. There was an invited session on neural information theory that I went to in the morning, and then I spent some time revising my slides for Thursday.
Plenary: Exciting directions in Algebraic Coding Theory (Alex Vardy)
Alex Vardy’s a great speaker, and he gave a really clear talk about coding theory and where it is going. He brought out some exciting open problems in classical coding theory that have deep connections to other areas of mathematics, and gave a good overview of list decoding for Reed-Solomon codes. I think it was videotaped; I hope they put the video online sometime, since I missed the first 10 minutes.
-
Universal Quantile Estimation with Feedback in the Communication-Constrained Setting (Ram Rajagopal, Martin Wainwright, Pravin Varaiya)
This is a distributed estimation setting: a bunch of sensors observe iid copies of some random variable X and want to let a central processor compute a quantile of the source. Each sensor can send one bit per round to the central processor, and the central processor can feed back one bit to all of the sensors. Ram proposed two schemes and showed that both are order-optimal when measured against a centralized estimator that has access to all of the sensors’ observations. It’s pretty surprising, actually.
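To fix ideas, here is a toy bisection-style protocol that fits the same communication model, one bit up per sensor per round and one broadcast bit back (this is just my illustration, not necessarily either of the proposed schemes):

```python
import numpy as np

def toy_quantile_protocol(samples, q, lo, hi, rounds=20):
    """Toy bisection protocol: each sensor holds one sample and sends one bit
    per round (is my sample <= the current midpoint?); the center broadcasts
    one feedback bit (was the fraction of ones at least q?), and everyone
    shrinks the search interval the same way."""
    for _ in range(rounds):
        mid = 0.5 * (lo + hi)
        bits = samples <= mid               # one uplink bit per sensor
        if bits.mean() >= q:                # one downlink bit, broadcast to all
            hi = mid                        # the q-quantile is at or below mid
        else:
            lo = mid                        # the q-quantile is above mid
    return 0.5 * (lo + hi)

rng = np.random.default_rng(0)
x = rng.normal(size=1000)                   # one iid observation per sensor
print(toy_quantile_protocol(x, q=0.9, lo=-10.0, hi=10.0))
print(np.quantile(x, 0.9))                  # centralized estimate, for comparison
```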
-
Achieving the Gaussian Rate-Distortion Function by Prediction (Ram Zamir, Yuval Kochman, Uri Erez)
If you have a stationary Gaussian process, one way of compressing it is to first filter it so that it looks iid (this is called whitening) and then compress the whitened source (called the innovations). Unfortunately, this open-loop prefilter-then-quantize approach is strictly suboptimal. If you instead put the quantizer inside the prediction loop (as in DPCM), you can trim out the fat and actually achieve the true rate-distortion function. I didn’t really know about this problem, but it seems pretty cool to me.
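For intuition, here is a bare-bones DPCM loop on an AR(1) Gaussian source, with a plain uniform scalar quantizer standing in for whatever the paper actually uses (the point is only where the quantizer sits, not the specific quantizer):

```python
import numpy as np

def dpcm(x, a, step):
    """Closed-loop DPCM for an AR(1) source x[t] = a*x[t-1] + w[t]: predict
    from the *quantized* reconstruction, quantize only the prediction error,
    and rebuild the signal from prediction + quantized error."""
    recon = np.empty_like(x)
    prev = 0.0
    for t, xt in enumerate(x):
        pred = a * prev                      # prediction from past reconstruction
        err = xt - pred                      # prediction error ("innovation")
        err_q = step * np.round(err / step)  # uniform scalar quantizer
        recon[t] = pred + err_q              # same reconstruction at both ends
        prev = recon[t]
    return recon

rng = np.random.default_rng(1)
a, n = 0.9, 10_000
w = rng.normal(size=n)
x = np.empty(n)
x[0] = w[0]
for t in range(1, n):
    x[t] = a * x[t - 1] + w[t]               # Gaussian AR(1) source

recon = dpcm(x, a=a, step=0.5)
print("mean squared distortion:", np.mean((x - recon) ** 2))
```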
-
Awards Banquet
The joint IT/Comm paper award went to the DUDe paper on discrete universal denoising. My only response is that “the DUDe abides.” The IT paper award went to Orlitsky et al.’s work on compressing sources whose alphabet size is unknown, which has connections to Good-Turing estimation and is pretty darn cool.
-
Covering spheres and balls with smaller balls (Ilya Dumer)
I don’t have much to say here, but the result is a tightening of older bounds on how many spherical caps of a given size you need to cover the whole sphere. It’s kind of up my alley, and the title is pretty funny.
-
Rényi to Rényi — Source Coding under Siege (Michael Baer)
Another fun title! The problem here is pretty interesting: imagine you are under siege and a spy returns from the field, but he is so exhausted that he will die shortly. You can ask the spy only yes-or-no questions, and you need to get a piece of information out of him before he dies. It’s like 20 questions with a time limit: you want to maximize the chance that you get the answer out in time. Another example is a connection to a base station that stays up for an unknown amount of time, over which you need to communicate a non-uniform source so as to maximize the chance of getting the full message across before the connection dies. Baer shows a connection to Huffman coding and gets some improved bounds.
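The key point (illustrated here with my own made-up numbers, not anything from the talk) is that the objective is no longer expected codeword length but the probability of finishing before a random deadline, so the usual Huffman code need not be the best choice:

```python
def success_prob(p, lengths, survival):
    """P(decode before the deadline) = sum_i p_i * P(T >= length_i),
    where survival[l] = P(T >= l) for the random number of questions T."""
    return sum(pi * survival[li] for pi, li in zip(p, lengths))

p = [0.5, 0.2, 0.15, 0.15]                   # source probabilities
survival = {1: 1.0, 2: 0.9, 3: 0.1}          # made-up deadline distribution

huffman_lengths = [1, 2, 3, 3]               # minimizes expected length (1.8 bits)
fixed_lengths = [2, 2, 2, 2]                 # longer on average (2.0 bits)

print(success_prob(p, huffman_lengths, survival))   # 0.71
print(success_prob(p, fixed_lengths, survival))     # 0.90
```

Here the fixed-length code loses on expected length but wins on the chance of beating the deadline, which is exactly the kind of mismatch that makes the problem interesting.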
-
Information rates subjected to state masking (Neri Merhav, Shlomo Shamai (Shitz))
This is like dirty paper coding, but slightly different. There is a channel with a state known noncausally at the encoder, but the encoder has two objectives: communicate at a rate R with the decoder, and minimize the decoder’s ability to infer the channel state, where the latter is measured by the equivocation. It’s a pretty cool setup, I think, and they solve the Gaussian case completely. I have to read the paper to get the full insights, but it definitely seems relevant to things that I’m interested in and working on.
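If I’m parsing the setup right (this is my paraphrase, so the exact form of the constraint may differ from the paper), the object of interest is something like the set of achievable rate-equivocation pairs:

```latex
% Channel p(y|x,s) with the state sequence S^n known noncausally at the encoder.
% A pair (R, E) is achievable if the message is decoded reliably at rate R while
% the decoder's uncertainty about the state stays above E:
\[
  \liminf_{n \to \infty} \tfrac{1}{n} H\!\left(S^n \mid Y^n\right) \;\ge\; E .
\]
```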
As a note, I’m not quite sober… but at poker night tonight the question of what life would be like with four balls came up, along with the prospect of writing a paper on the maximal packing of said four balls. (I’m the only female at the table, so I just roll with these things.) Perhaps I should send the guys the reference to the fourth talk for further research on the topic…
Actually, I should probably just drink less at poker night. Or at least not comment on your blog afterwards.
Good luck with your talk tomorrow!
Just so you know, your blog is the second hit when you google isit 2006. (Robbie saw one of your posts on my friends page and wanted to know where the conference is being held.)