For those readers of the blog who have not submitted papers to machine learning (or related) conferences, the conference review process is a bit like a mini-version of a journal review. You (as the author) get the reviews back and write a response; the reviewers then discuss the paper and (possibly, though in my experience rarely) revise their reviews. They are, however, generally supposed to take the response into account during the discussion. In some cases people even adjust their scores; when I've been a reviewer I have often adjusted mine, especially if the author response addresses my questions.

This morning I had the singular experience of having a paper rejected from ICML 2014 in which all of the reviewers specifically marked that they did not read and consider the response. Based on the initial scores the paper was borderline, so the rejection is not surprising. However, we really did try to address their criticisms in our rebuttal. In particular, some reviewers misunderstood what our claims were. Had they bothered to read our response (and proposed edits), perhaps they would have realized this.

Highly selective (computer science) conferences often tout their reviews as being just as good as a journal's, but in both outcomes and process, that is a pretty ludicrous claim. I know this post may sound like sour grapes, but it's not about the outcome, it's about the process. Why bother with the facade of inviting authors to rebut if the reviewers are unwilling to read the response?

S. Raj Rajagopalan and collaborators at Honeywell are doing some security research on making better passwords. They are looking for some people to do a quick study on password design.

Along with a couple of Honeywell security researchers I am running a study on a rather familiar problem for most of us – creating memorable but secure passwords, i.e. how to generate passwords that are both suitably random and memorable. We have just launched a simple user study that asks volunteers to participate in an interactive session that lets them choose password candidates and see how well they remember them. Needless to say, these are not actual passwords used by any system, only strings that could be used as passwords.

No personal information is collected in the study and the system only stores the data that is actually provided by the user. To that end, you may decline to provide any piece of information you choose. The study takes only a couple of minutes to finish. You may run it multiple times if you wish (and you will likely get different use cases), but you will have to clear your browser's cache to get a fresh configuration.

We need at least 300 participants to get statistical significance, so we would appreciate it if you could participate in the study.

Please click here to go to the study: http://138.91.115.120:8080/syspwd

Thanks for your help. Any questions on the study may be directed to me.

Raj

Via Cynthia, here is a column by James Mickens about how horrible the web is right now:

Computer scientists often look at Web pages in the same way that my friend looked at farms. People think that Web browsers are elegant computation platforms, and Web pages are light, fluffy things that you can edit in Notepad as you trade ironic comments with your friends in the coffee shop. Nothing could be further from the truth. A modern Web page is a catastrophe. It’s like a scene from one of those apocalyptic medieval paintings that depicts what would happen if Galactus arrived: people are tumbling into fiery crevasses and lamenting various lamentable things and hanging from playground equipment that would not pass OSHA safety checks.

It’s a fun read, but also a sentiment that may resonate with those who truly believe in “clean slate networking.” I remember going to a tutorial on LTE and having a vision of what 6G systems will look like. One thing that is not present, though, is the sense that the system is unstable, and that the introduction of another feature in communication systems will cause the house of cards to collapse. Mickens seems to think the web is nearly there. The reason I thought of this is the recent fracas over the US ceding control of ICANN, and the sort of doomsdaying around that. From my perspective, network operators are sufficiently conservative that they can’t/won’t willy-nilly introduce new features that are only half-supported, like in the Web. The result is a (relatively) stable networking world that appears to detractors as somewhat Jurassic.

I’d argue (with less hyperbole) that some of our curriculum ideas also suffer from the accretion of old ideas. When I took DSP oh-so-long ago (13 years, really?) we learned all of this Direct Form II Transposed blah blah, which I’m sure was useful for DSP engineers at TI to know at some point, but which has no place in a curriculum now. And yet I imagine there are many places that still teach it. If anyone still reads this, what are the dinosaurs in your curriculum?
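(For anyone who never encountered it: Direct Form II Transposed is just one of several equivalent ways to realize the same IIR difference equation. The snippet below is my own minimal Python sketch of the second-order version, not something from any course or textbook, and the function name is made up; for the same coefficients its output should agree with a generic IIR filter routine such as scipy.signal.lfilter.)

```python
# Minimal sketch of a Direct Form II Transposed biquad (second-order IIR section).
# The function name and interface are my own; coefficients are assumed normalized
# so that a[0] == 1.
def df2t_biquad(x, b, a):
    """Filter the sequence x with b = (b0, b1, b2) and a = (1, a1, a2)."""
    b0, b1, b2 = b
    _, a1, a2 = a
    s1 = s2 = 0.0          # the two delay registers
    y = []
    for xn in x:
        yn = b0 * xn + s1              # output taps the first register directly
        s1 = b1 * xn - a1 * yn + s2    # registers are updated using the *output*,
        s2 = b2 * xn - a2 * yn         # which is what makes the form "transposed"
        y.append(yn)
    return y

# Example: the impulse response of y[n] = 0.5*x[n] + 0.5*y[n-1]
print(df2t_biquad([1.0, 0.0, 0.0, 0.0], b=(0.5, 0.0, 0.0), a=(1.0, -0.5, 0.0)))
```

Whether anyone still needs to hand-roll this structure (as opposed to knowing it exists) is exactly the curriculum question.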

  1. Song For The Sold – Kishi Bashi
  2. Calling – Snorri Helgason
  3. Heart Beats – Hey Marseilles
  4. We Get Along – Sharon Jones & The Dap-Kings
  5. 1904 Twin Peaks – Alice Russell
  6. We Don’t Sleep – Har Mar Superstar
  7. Digital Witness – St. Vincent
  8. Nothing You Could Say – Big Eyes
  9. Jump And Shout – The Dirtbombs
  10. Diversionary (Do the Right Thing) – Ages and Ages
  11. Another Ten Reasons – Young Fresh Fellows
  12. Where Can I Go? – Laura Marling

My college friend Ann Marie Thomas has a post up on the problematic use of the word “failure” in the discourse around education, technology, and design. I was listening to Dyson extol the virtues of failure on Science Friday recently, and it also made me cringe. Ann talks about the problems with “failure” from an education and design point of view, but I think it’s also problematic when it comes to teaching/training students to be researchers. One of the most normalizing things I heard in graduate school from my advisor was “well, that’s research for you” after I told him I had found a counterexample to everything I had “proved” in the previous week. I don’t think of that as “embracing failure” but rather as a recognition that the process is not one of continuous forward progress.

The sound-bite nature of the word does a disservice to the valuable underlying concept, which is, as Ann says, to “try something.” I think it’s not (often) true that students are afraid to try things because they are afraid to fail. It’s far more likely that they are unsure of how to try things, or what to try. The problem is too abstract and it’s hard to find any sort of inroad that might make sense. Or they thought they had an inroad, it’s absolutely not working, and they are frustrated because they can’t step back and say “this approach is bad.”

I can’t help but think that this talk of “failure” is somehow leaking in from positive psychology. I think it treats us like children who may be afraid to go down some stairs because they are too tall, or afraid to try the new food because it looks funny. It obscures the really difficult part, which is about where to start, not how you end.

Via Serdar Yüksel and the IT Society, the 2014 IEEE North American School on Information Theory will be held at the Fields Institute in Toronto this June. The lecturers for this school are:

A taste test for fish sauces.

My friend Ranjit is working on this Crash Course in Psychology. Since I’ve never taken psychology, I am learning a lot!

Apparently the solution for lax editorial standards is to scrub away the evidence. (via Kevin Chen).

Some thoughts on high performance computing vs. MapReduce. I think about this a fair bit, since some of my colleagues work on HPC, which feels like a different beast than a lot of the problems I’ve been thinking about.

A nice behind-the-scenes on Co-Op Sauce, a staple at Chicagoland farmers’ markets.
