Futurama and information theory

According to the article The Truth About Bender’s Brain in the May 2009 issue of IEEE Spectrum, one of the writers of Futurama, Ken Keeler, “confirms that he does read every issue of IEEE Spectrum and occasionally looks at the Transactions on Information Theory.” So is it true that source coding people are funnier than channel coding people?


they’re doing it by degrees

It’s clear I’m a big nerd, but thinking back on the wonder that was Square One, I am astonished at how much work they put into their cultural references and parodies. You can dance if you want to… as long as it’s the Angle Dance!

On the official website you can watch some other clips. If they ever released this show on DVD I would queue up to buy it.

Postprint server for the California Digital Library

I’m surprised that so few people use the Postprint server at the California Digital Library (CDL). This is a service for University of California students, staff, and faculty that lets you post the final version of your paper and makes it freely available. The nice thing is that they pay attention to all the copyright details for you, so you don’t have to sort that out yourself. There are a few good reasons to use it: (a) it’s good to archive your work with your “employer,” (b) it helps promote open access to postprints in disciplines that don’t use ArXiV, and (c) they can give you statistics on the number of downloads.

Just looking through, it seems that only a small fraction of the papers written by UC people make it into the repository. The University of California does not have an open-access mandate like Harvard does, but provides this as an option. As Michael Mitzenmacher has noted, mandates can become tricky, which is why the CDL’s handling of the copyright details is nice.

Students = lemmings?

Apparently students are lemmings:

But a new study by a trio of international researchers finds that college undergraduates let their peers influence their choice of major, often leading them into careers that were not best suited to their skills — and ultimately diminished their income… if the study’s results hold up, students might be encouraged to think for themselves.

The whole thing seems a little suspect to me, and I’m not sure it really generalizes to the US. They studied business students at an Italian university with only two choices (business or economics) and then divided them into groups like “ability driven” and “peer driven.” It just sounds like way too many variables to control for.

However, reading the article reminded me of the game Lemmings, which was awesome in that early 90s way…

Infocom 2009: capacity of wireless networks

A Theory of QoS for Wireless
I-Hong Hou (University of Illinois at Urbana-Champaign, US); Vivek Borkar (Tata Institute of Fundamental Research, IN); P. R. Kumar (University of Illinois at Urbana-Champaign, US)
This paper claimed to propose a tractable and relevant framework for analyzing the quality of service in a network. In particular, the model should capture deadlines, delivery ratios, and channel reliability while providing guarantees for admission control and scheduling. The model presented was rather simple: there are N clients trying to access a single access point in slotted time. The clients generate packets periodically with period \tau, and each packet has a deadline of \tau. The channel quality is captured by a client-dependent success probability for the uplink; the downlink always succeeds. For the uplink, the access point selects a transmitter via polling and the client retransmits the packet until it is received. The questions are then whether a given set of clients and rates is feasible, and what policy can achieve those rates. The paper presents an outer bound and two policies based on the notion of workload w_n = (1/p_n) (q_n/\tau), where q_n is the delivery ratio of user n and p_n its uplink success probability. Policies that serve the largest “debt” first are shown to be optimal, where “debt” can be the difference between w_n and the fraction of time given to user n, or a weighted delivery ratio.
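To get a feel for the largest-debt-first idea, here is a toy simulation under my reading of the model. The time-debt accounting (each client is “owed” \tau w_n = q_n/p_n slots per period) is my own simplification, not the paper’s exact policy, and all names are made up:

```python
import random

def simulate_debt_first(p, q, tau, periods=2000, seed=0):
    """Toy largest-debt-first scheduler (a sketch, not the paper's policy).

    p[n]  : uplink success probability of client n
    q[n]  : required delivery ratio of client n
    tau   : slots per period; each client generates one packet per period
    Returns the empirical delivery ratio of each client.
    """
    rng = random.Random(seed)
    n_clients = len(p)
    delivered = [0] * n_clients
    debt = [0.0] * n_clients  # promised service time minus service received
    for _ in range(periods):
        pending = set(range(n_clients))  # packets alive this period
        for n in range(n_clients):
            # workload w_n = (1/p_n)(q_n/tau), so tau*w_n = q_n/p_n
            # slots of service are owed per period
            debt[n] += q[n] / p[n]
        for _ in range(tau):
            if not pending:
                break
            # poll the pending client with the largest debt
            n = max(pending, key=lambda i: debt[i])
            debt[n] -= 1.0
            if rng.random() < p[n]:  # uplink transmission succeeds
                delivered[n] += 1
                pending.remove(n)
        # packets not delivered by the deadline tau are dropped
    return [d / periods for d in delivered]
```

With perfect channels and \tau large enough to serve everyone, the achieved ratios should hit the requirements exactly; the interesting regime is when p_n < 1 and the slots per period are scarce.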

Computing the Capacity Region of a Wireless Network
Ramakrishna Gummadi (University of Illinois at Urbana-Champaign, US); Kyomin Jung (Massachusetts Institute of Technology, US); Devavrat Shah (Massachusetts Institute of Technology, US); Ramavarapu Sreenivas (University of Illinois at Urbana-Champaign, US)
This looks at wireless networks under the protocol model of Gupta and Kumar and asks the following question: if I give you a vector of rates \mathbf{r}, is there a way to tell whether \mathbf{r} is in the capacity region of the network? Previous work had looked at polynomial time algorithms for scheduling, but apparently in the wireless setting the scheduling problem is NP-hard. The question they look at instead is testing whether there exists a routing scheme which leads to feasible link rates. This can be reduced to approximating “max-weight independent set” (MWIS), which I am not so familiar with. I think it relates to the backpressure algorithm. In any case, they find a polynomial time approximation assuming the communication graph has polynomial growth, which holds for many nice graphs (like random geometric graphs with standard connectivity assumptions).
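For readers who, like me, are hazy on MWIS: the vertices are links (with an edge between two links that interfere and so cannot be scheduled together), the weights are something like queue backlogs, and you want the heaviest independent set. This is not the paper’s approximation algorithm (which exploits polynomial growth); it is just a standard greedy heuristic to show what the problem looks like, with a made-up conflict-graph encoding:

```python
def greedy_mwis(adj, weight):
    """Greedy heuristic for maximum-weight independent set.

    adj[v]    : set of neighbors of vertex v in the conflict graph
                (an edge means the two links interfere)
    weight[v] : weight of vertex v (e.g. a link's backlog)
    Repeatedly picks the vertex maximizing weight / (remaining degree + 1),
    a standard greedy rule; returns an independent set, not necessarily
    the optimal one.
    """
    remaining = set(adj)
    chosen = set()
    while remaining:
        v = max(remaining,
                key=lambda u: weight[u] / (len(adj[u] & remaining) + 1))
        chosen.add(v)
        remaining -= adj[v] | {v}  # drop v and its neighbors
    return chosen
```

Exact MWIS is NP-hard in general, which is why the paper’s restriction to graphs of polynomial growth matters: it is what makes a polynomial time approximation possible.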

The Multicast Capacity Region of Large Wireless Networks
Urs Niesen (Massachusetts Institute of Technology, US); Piyush Gupta (Bell Labs, Alcatel-Lucent, US); Devavrat Shah (Massachusetts Institute of Technology, US)
The network model is n users placed uniformly in the square [0, \sqrt{n}]^2. This looks at the capacity region assuming arbitrary multicast demands between users, so each node in the network has a set of receivers to which it would like to multicast its message. What traffic demands are admissible? The framework is one of scaling laws as the number of nodes goes to infinity, under the protocol model and enforcing multi-hop routing. The main result is that the capacity region is equal to the cutset region within an n^{o(1)} multiplicative factor (on both the upper and lower bounds), and furthermore the cutset region can be evaluated with fewer constraints. The achievable scheme involves building a “multiresolution tree” of sorts and then showing that routing in the original problem is the same as routing in the tree. The tree structure gives the constraint reduction for the cutset bound. To actually route the traffic, they develop a “message concentration” scheme that is reminiscent of other hierarchical cooperation schemes, but is, I guess, a bit different.

IEEE : no longer an acronym

I didn’t realize this, but IEEE is not officially an acronym anymore:

The IEEE name was originally an acronym for the Institute of Electrical and Electronics Engineers, Inc. Today, the organization’s scope of interest has expanded into so many related fields, that it is simply referred to by the letters I-E-E-E (pronounced Eye-triple-E).

Was this recent, or has it been like that for a while? Perhaps it’s the family history, but I always thought that IEEE was supposed to be an acronym…