I think this is the end of my ITA blogging! But there were some issues that came up during the conference that may be of interest to some of the readers of this blog (although from anecdotal reports, there are many people who read but never comment, so I’m not sure what to do to encourage more discussion).

The ECE department at The University of Texas at Austin seeks highly qualified candidates for postdoctoral fellowship positions, lasting up to two years, in the information sciences, broadly defined. Applicants should have, or be close to completing, a PhD in ECE, CS, Math, Statistics or related fields.

Vijay Subramanian passed along this job opening in case readers know of someone who would be interested…

FULL PROFESSOR, HAMILTON INSTITUTE, NATIONAL UNIVERSITY of IRELAND MAYNOOTH

The Hamilton Institute at the National University of Ireland Maynooth invites applications for a Chair position starting in Summer 2013. Appointment will be at full professor level. Exceptional candidates in all areas will be considered, although we especially encourage candidates working in areas that complement existing activity in the mathematics of networks (distributed optimisation, feedback control, stochastic processes on graphs) as applied to smart transport, smart city data analytics and wireless networks.

The Hamilton Institute is a dynamic and vibrant centre of excellence for applied mathematics research. The successful candidate will be a leading international researcher with a demonstrated ability to lead and develop new research directions. A strong commitment to research excellence and a successful track record in building strategic partnerships and securing independent funding from public competitive sources and/or through private investment are essential.

Informal enquires can be directed to Prof. Doug Leith (doug.leith@nuim.ie), Director of the Hamilton Institute. Details on the Hamilton Institute can be found at www.hamilton.ie.

Further information on the post and the application procedure can be found here.

The deadline for applications is 11th Feb 2013.

Despite Yury’s attempts to get me to “stop blogging,” here is my much-delayed recap of Allerton. Between moving to TTI-Chicago right after Allerton and then almost immediately shuttling back to UCSD for the iDASH Privacy Workshop, things have been a bit delayed. I could only attend for two days, but I wanted to highlight a few interesting talks that I saw. More Allerton blogging was done by Michael Mitzenmacher (part 1, part 2) and a bit by Maxim Raginsky about his talk on causal calculus (since he blogged about it, I don’t have to, ha!). The conference has gotten too big and the rooms are too small to hold the audience, so it is probably time to move the thing. We have similar issues at ITA, and the 2012 ITA Workshop is moving off campus next year (you heard it here first, folks!).

But here are some interesting talks I saw:

“The first phase [of life] you believe in Santa Claus, the second you don’t believe in Santa Claus, and in the third you become Santa Claus.” — Tony Ephremides

Prof. Anthony Ephremides gave the third plenary at ISIT, which included an interim report on the consummation between information theory and network coding and many connections between opera and research. I think he did a great job of explaining the difference between the throughput region, the stability region, and the capacity region (under bursty vs. non-bursty use). These are in increasing order of inclusion. Some interesting tidbits (some new, some not):

• He complained about the way the networking community handles fading: it simply says the probability that a packet is successfully received is $\mathbb{P}( SNR > \gamma)$.
• Under contention, the capacity region may not be convex, unlike in information theory where you can do time sharing.
• For wireless network coding it is important to connect the MAC protocol and scheduling issues as well as change the notion of cut capacities. That is, you shouldn’t replace edges with hyper-edges, because that’s not a good model.
• The information rate is the rate from the payload plus the information in the idleness and the information from the identity of the transmitter. That is, you get data from the actual packet, when the packet was sent, and who is sending the packet.
• Extending many analyses to more than 2 users has not been done.
• Can bursty traffic capacity be larger than non-bursty? The NNN (*) community says that would be a contradiction, because you could then always emulate bursty traffic. But this is unfair: idling the transmitter to build up packets and hence create artificial bursts is not allowed in the problem formulation, so there is no contradiction.
• Relaying is good from a network perspective because it can partially enable a first-come first-serve (FCFS) discipline. So relays bridge a gap to the optimal scheduling policy. This is different from the information-theoretic notion of cooperation yielding a diversity gain.
• Multicast throughput is an interesting thing to think about more rigorously for the future.
• His prescription: information theorists have to apply their tools more broadly, to networking problems and to unorthodox problems, and the networking community should use more rigorous modeling and analysis methods.
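As an aside, the packet-success abstraction criticized in the fading bullet above is easy to make concrete: under the standard Rayleigh-fading assumption the instantaneous SNR is exponentially distributed, so $\mathbb{P}( SNR > \gamma) = e^{-\gamma/\bar{\gamma}}$, where $\bar{\gamma}$ is the mean SNR. Here is a quick Monte Carlo sketch of that calculation (the particular mean-SNR and threshold values are purely illustrative):

```python
import math
import random

def success_prob_mc(mean_snr, gamma, trials=200_000, seed=0):
    """Monte Carlo estimate of P(SNR > gamma) under Rayleigh fading,
    where the instantaneous SNR is exponential with mean `mean_snr`."""
    rng = random.Random(seed)
    hits = sum(rng.expovariate(1.0 / mean_snr) > gamma for _ in range(trials))
    return hits / trials

def success_prob_exact(mean_snr, gamma):
    """Closed form for the same quantity: exp(-gamma / mean_snr)."""
    return math.exp(-gamma / mean_snr)

# Illustrative numbers: mean SNR of 10 (linear scale), threshold 3.
print(success_prob_mc(10.0, 3.0))     # close to...
print(success_prob_exact(10.0, 3.0))  # ...exp(-0.3) ~ 0.741
```

Ephremides’ point, as I understood it, is that collapsing the whole physical layer into this single threshold-crossing probability throws away structure that matters at the network level.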

(*) NNN = Nagging nabobs of negativism (really it should be “nattering nabobs of negativism,” Safire’s famous alliterative phrase wielded with such gusto by Spiro Agnew).