Via Serdar Yüksel and the IT Society, the 2014 IEEE North American School on Information Theory will be held at the Fields Institute in Toronto this June. The lecturers for this school are:
My friend Ranjit is working on this Crash Course in Psychology. Since I’ve never taken psychology, I am learning a lot!
Some thoughts on high-performance computing vs. MapReduce. I think about this a fair bit, since some of my colleagues work on HPC, which feels like a different beast than a lot of the problems I’ve been thinking about.
Some of my office furniture is on backorder, like the standing desk unit and my actual desktop, but in the meantime I have found a use for the hardcopy IEEE Transactions that I’ve been carting around with me from job to job:
Due to weather issues, I was unable to make it on time to ITA to give my talk, which is based on an arXiv preprint with Francesco Orabona, Tamir Hazan, and Tommi Jaakkola. The full work will be presented at ICML 2014 this summer. I decided to give the talk anyway and upload it to YouTube (warning: single take, much stammering):
I plan to post a bit more about this problem later (I know, promises, promises), but in the meantime, this talk is mostly background about the MAP perturbation framework.
During my time hanging out with machine learners, I think no topic has received as much attention as the quality of the review process for competitive conferences. My father passed along this paper by Graham Cormode on “the tools and techniques of the adversarial reviewer,” which should be familiar to many. I had not seen it before, but many of the “adversarial” techniques sounded familiar from reviews I have received. I also wonder to what extent reviews I have written could be interpreted as deliberately adversarial. I don’t go into the review process that way, but it’s easy to ascribe malign intent to negative feedback.
Cormode identifies four characteristics of the adversarial reviewer: grumpiness, elitism, peevishness, and arrogance. He then identifies several boilerplate approaches to writing a negative review, specific strategies for different sections of the paper, and the art of writing pros and cons for the summary. My favorite in this latter section is that the comment “paper is clearly written” really means “clearly, the paper has been written.”
As Cormode puts it himself at the end of the paper: “I am unable to think of any individual who consistently acts as an adversarial reviewer; rather, this is a role that we can fall into accidentally when placed under adverse conditions.” I think this is all too true. When reviewing the 9th paper for a conference, with 3 weeks to do all 9, the reviewer’s patience may be worn a bit thin, and it’s easy to be lazy and not take the paper on its own merits. What’s certainly true, however, is that “editors and PC members” often do not “realize when a review is adversarial.” In part this is because, as a research community, we don’t want to acknowledge that there are real problems with the review process that need fixing.
With the recent furor over Penguin’s decision to pulp copies of Wendy Doniger’s The Hindus after some intense pressure, I was reminded of Delhi University’s decision to ban a much less “controversial” essay by A.K. Ramanujan entitled Three Hundred Ramayanas. It’s a wonderful piece of writing and well worth a read. What it points to is the vast plurality of traditions and interpretations.
I made a bookmarklet for the Rutgers Library’s proxy server by changing the URL format from the UChicago ProxyIt! link:
You can cut and paste that into a link in the bookmarks bar of your browser.
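For the curious, library proxy bookmarklets of this sort generally follow the EZproxy `login?url=` pattern. Here is a sketch in that style; the proxy hostname below is a placeholder for illustration, not the actual Rutgers (or UChicago) proxy address:

```javascript
// Sketch of an EZproxy-style proxy bookmarklet.
// The hostname is a placeholder -- substitute your library's real
// EZproxy login URL to make a working bookmarklet.
const PROXY_PREFIX = "https://proxy.libraries.example.edu/login?url=";

// Build the proxied version of a page URL.
function proxiedUrl(pageUrl) {
  return PROXY_PREFIX + encodeURIComponent(pageUrl);
}

// The one-line bookmarklet form: re-request the current page through
// the proxy when clicked.
const bookmarklet =
  "javascript:location.href='" + PROXY_PREFIX +
  "'+encodeURIComponent(location.href);";
```

Saving the `bookmarklet` string as the URL of a bookmark in your browser’s bar, then clicking it while on a paywalled article page, sends that page through the proxy for authentication.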
I finally snagged a few minutes to crunch some numbers from this year’s Mystery Hunt and add them into the data we have from the past few years. Firstly, the chart everyone likes to see, the hunt duration:
The spike from the epic 2013 Hunt is a bit aberrant, but overall it took less time to find the coin this year than in the 2008–2009 Hunts.
The next graph is the total number of hunters, which has been generally increasing over time:
This is somewhat alarming, since running a hunt for 2000 hunters is a significantly different challenge than running one for 1000. A number of caveats are in order: many of these numbers are estimates, and they do not disaggregate remote from on-site solvers. Because each organizing team gathers statistics differently, the most we can safely infer is that the Hunt has been growing over time.
Finally, the most interesting chart (at least to me): the distribution of team sizes:
This year we had a record-breaking 62 teams register (actually more, but some dropped out — I’m looking at you, Fangorn Foureast). The growth in this chart is not because we have more mega-teams (100+ people); there are only 3 of those, after all. The biggest change in the last 3 years is in the number of teams in the under-50 (really, under-40) category. We now have many moderate-sized teams that need enough space for an HQ that they can’t run the Hunt out of their dorm rooms. This growth in the number of smaller teams is part of why adopting a design philosophy like Erin’s is important.
I’ll have to dig through the raw numbers from this year’s hunt to get more specifics about the split of this year’s hunters. As Erin pointed out, with the increase in “smaller” teams, the question is who is on these teams — mostly students? A post for another time, I imagine.
I’ve been at a lot of different institutions over the last few years, and I think there are a number of things that new graduate students in research-oriented programs can do on their own to build the mindset and skills to do research more effectively. An advisor is not even needed! This advice is of course oriented toward more technical/theory types in engineering, but some of it is general. Note: I say research-oriented because there are many MS programs where students don’t really care too much about research. On the one hand, this is still good advice for them; on the other hand, they are not trying to find a PhD advisor.
- Go to lots of seminars. This was some great advice I got from Anant Sahai when I was starting grad school. As soon as you get to grad school, sign up for all of the seminar mailing lists in your department and outside your department that you think may be interesting to you. For me it was statistics, networking/communications/DSP, one of the math seminars, and some of the CS seminars. Go to the talk, take notes, and try to understand what the problem is, why it’s important, and what tools they use to solve it. Without the right classes you may not understand the technical aspects of the talk, but you will learn about different areas of active research, how to present research (or how not to, sometimes), and new tools and terminology that may not be covered in coursework. You may see a paper referenced that you would want to look at later. Faculty will see that you’re interested in research and trying to learn something outside of class. Go to talks outside your area to learn some new things. Go to broad-audience colloquium talks to understand trends and developments across other areas of engineering outside of your interests.
- Read papers regularly. This is hard. You’re not going to understand the papers. But much like learning a foreign language, you have to read and then make notes of things that you don’t understand and want to look up later. At first, read the abstract, introduction, model, and main results, or as much as you can handle. It will be confusing, but you will get a sense of what research is being done, what kinds of questions people ask, and so on. Bookmark the things that sound interesting so you can come back to them later. Set aside a little time every few days to do this. It’s like exercise — you have to practice regularly. Read broadly so you can get a sense of how different problems/models/questions relate to each other.
- Learn LaTeX if you don’t know it already. There is nothing worse than trying to write your first paper and trying to learn LaTeX at the same time. You can practice by trying to write up a homework solution or two in LaTeX. In general, being familiar with the tools used in research before you actually “need” them is a great idea.
- Learn to program. I’m still a mediocre programmer, but I’m trying to get better. Most entering grad students in ECE don’t know MATLAB beyond the level of doing homework assignments. You don’t have to become a code ninja, but learning to write and document code that others can read, and that you can debug easily, will save a lot of headaches down the road.
- Make a website for yourself. You want to be the top hit when someone searches your name and institution. It doesn’t have to have a ton of information on it, but it makes a difference. I’ve seen job candidates who somehow don’t have a homepage with information about their publications and papers. In this day and age, the first thing people are going to do after meeting you at a conference is Google you.
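On the programming point above, the habit of writing small, documented, easily debugged functions is easier to show than to describe. A minimal sketch (the function and its behavior are illustrative, not from any particular assignment):

```javascript
/**
 * Simple moving average of an array with the given window size.
 * A short doc comment plus an input check make a function far easier
 * for a collaborator (or future you) to read and debug.
 */
function movingAverage(x, window) {
  if (window < 1 || window > x.length) {
    throw new RangeError("window must be between 1 and x.length");
  }
  const out = [];
  for (let i = 0; i + window <= x.length; i++) {
    let sum = 0;
    for (let j = i; j < i + window; j++) sum += x[j];
    out.push(sum / window);
  }
  return out;
}

// Example: smoothing noisy measurements in pairs.
console.log(movingAverage([1, 2, 3, 4], 2)); // [1.5, 2.5, 3.5]
```

The same habits carry over directly to MATLAB homework scripts: a header comment, a sanity check on inputs, and a small test case you can rerun after every change.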
In general, entering graduate school can be quite daunting, and many students fall into the trap of just taking a bunch of classes in search of “what’s interesting.” The dirty secret is that most first-year graduate courses don’t have a lot of active research topics in them (maybe this is a problem). If you’re interested in doing research, you need to practice by expanding your horizons through going to talks and reading papers, building technical skills like programming and writing LaTeX effectively, and professionalizing by making a website to communicate your interests and research.
I occasionally enjoy Thai cooking, so I appreciated some of the comments made by Andy Ricker.
I recently learned about India’s Clean Currency Policy, which went into effect this year. I still have some money (in an unpacked box, probably) from my trip last fall, and I wonder whether any of it will still be usable when I go to SPCOM 2014 this year. The policy sounded a bit crazy to me, though; further investigation indicates that an internal circular was leaked, and that the actual plan is a more sensible multi-year effort to phase in more robust banknotes. My large-ish pile of Rs. 1 coins remains useless, however.
There’s been a lot of blogging about the MIT Mystery Hunt (if I weren’t so hosed starting up here at Rutgers, I’d probably have blogged about it earlier), but if you want the story and philosophy behind this year’s Hunt, look no further than the writeup of Erin Rhode, who was the Director of the whole shebang.
Last year I did a lot of flying, and as a result had many encounters with the TSA. This insider account should be interesting to anyone who flies regularly.