Most amusing rejection so far

Thank you for applying to the Department of Computer Science at Columbia University. I am sorry to tell you that we will be unable to offer you a position this year. The Computer Science department’s recent call for applications for new faculty members generated several hundred responses. Our delight at receiving so many applications was muted by the realization that we would be unable to talk with a large number of excellent candidates.

phone interviews

Suresh posted a few months ago about academic phone interviews and asked “maybe it’s because there are more people chasing each slot and so these filters are more necessary now?” I’ve had a few phone interviews this year, with some turning into on-campus interviews and some not. Although it’s considered a thing that only smaller departments will do, I actually think the phone interview has a lot of positive features that make sense for lots of departments:

  • You can screen a much larger set of candidates — it’s probably quite difficult to decide on 6 people to invite for on-site visits out of 300 applicants. Phone interviews let you screen out those who seem underprepared or uninterested in your job (i.e. those who applied just because it was there). If someone’s research is not really in your area (e.g. a department with no information theory people), it’s a good chance to have the candidate explain it to you rather than puzzling through the research statement. This also saves money.
  • You can talk to unknown candidates — of course if your advisor is great friends with someone at school X then chances are that person will know your name (or at least your advisor’s name on your CV). But hiring people you know personally may be a suboptimal strategy long-term, so phone interviews let you broaden your search.
  • It can be done in a decentralized manner — you don’t need the whole committee to be there on the phone call. Divide and conquer!
  • If your search is pretty broad, then you can talk to a few people in several different areas. This means you can find the best-sounding candidate in each area and then the committee can try to compare good apples and good oranges instead of the whole motley cornucopia.
  • From the interviewee’s perspective, you get to learn quite a bit more about the department, its priorities, and the culture from a 30-minute chat on the phone. You get this from the questions they ask as well as the questions you get to ask. That’s definitely the sort of thing you can’t get from the website.
  • It provides good feedback for the interviewee — if you get a phone interview, you know you’ve made some sort of list (medium, short, whatever) and that knowledge is helpful, given the uncertainty mentioned in my previous post.

That’s not to say I necessarily enjoyed all of the phone interviews; the phone is an awkward medium. But I do think on balance that they are a good way to improve the search process from the employer and job-seeker side. Besides, I’m not sure I look my best in Skype video chats…

Some thoughts on the job market

I’m interviewing this spring to find my next gig after this postdoc, which is a convenient way for me to excuse my lack of posting. Applying for jobs is in a way a job in itself, with attendant time sinks and things popping up, etc. One thing that struck me is the sheer inefficiency of the process. This is my third time applying, and I think I sent in about 60 applications (most of which I had no chance for, in retrospect) for academic and research lab positions. Most of my comments here relate to the academic market.

Different places want different things. Some schools don’t want a cover letter. Some do. Some want you to email the application as a single PDF. Some want you to fill out half the information on your CV into a web form and then also submit your CV. Some schools want a combined two-page research and teaching statement, and some want them separately (or with page requirements for each). Some don’t want any teaching statement. Some schools want letters sent directly, some will email a link to your recommenders, some want hardcopy letters, and some will request letters only from a few applicants. Some want 3 letters, some 5, and some up to 8. Some places have a common interface like AJO. Many schools use the same software package (like RAPS at Columbia).

The bewildering variety of formats makes it hard for applicants to keep their recommenders (who are busy people) informed. I sent my recommenders endless emails with lists of which schools wanted what, which schools they should have heard from, and which schools will only contact them if I made the first cut (in which case, could they let me know for my own records?). What if your application somewhere is rejected because they sent an automated email to your letter writers without informing you and it was eaten by their spam filter? This would hardly be fair, but I imagine that it does happen. I’m not sure what is to be done, but moving to a common format like the AMS Coversheet, or using some kind of letter warehousing service, may not be a bad move.

Another related factor that contributes to inefficiency and psychological distress is the lack of feedback on the status of one’s application. I got a rejection letter from last year’s job search in October of this year. Did they really make the decision only then, or were they just flushing their buffer? I’d prefer an early form rejection to the ambiguity, even from the place that wants a “mixed-signal circuit” expert but welcomes “excellent candidates in all areas.” Just getting an email saying “sorry, you’re not a good fit” can help refocus the applicant’s attention on those openings which are still “open.” It’s a buyer’s market — there are 300 applicants for each open position, so perhaps departments don’t have time to send all of those letters. But email is cheap!

There’s no real way to make the application process less time-consuming, but I think it can be made less confusing and less draining. The question is how, and what is the incentive for employers?

Linkage

Between travel and lingering reviews, I have not had any time to really write anything particularly interesting or technical. I have a lot of thoughts, just not much willpower to write them down at the moment. In the meantime, be amused/saddened/scared/entertained by these links…

Out of Context Science.

Rep. Keith Ellison testifies at Rep. Peter King’s McCarthy-esque “hearings.” I’m sure people have seen the terrifying video from Orange County.

The Dayenu Principle applied to films beating you over the head. Enough already!

David Rees on America’s Next Great Restaurant: “Life’s too short not to eat kale every five minutes.”

The way we are treating Bradley Manning is immoral and illegal. If the first doesn’t bother you, the second should.

Goodnight, Dune, goodnight, Shai-hulud bursting out of the dune.

I should eat more cauliflower.

My friend Reno is famous on the internet!

Online voting is like drunk driving

So one of the stories that circulated during the EVT/WOTE workshop last summer revolved around a presentation Ron Rivest gave at a special workshop on the Uniformed and Overseas Citizens Absentee Voting Act (UOCAVA), in which he compared online voting to drunk driving. Today I saw that he has in fact posted the slides. Why the fuss? Apparently the default solution was to conduct voting for military personnel posted in, say, Afghanistan, via the internet. There is a raft of security issues with this, as outlined in the slides. They are pretty amusing, except when you realize that they will probably do the voting-over-the-internet thing anyway.

Concert bleg : Sacra/Profana is Excandescent

SACRA/PROFANA (dir. Krishan Oberoi) presents…

EX/CAN/DES/CENT*
*a. [L. excandescentia .] 1. Growing hot; white or glowing with heat

Saturday, February 19 · 7:00pm – 8:30pm
St. Peter’s Church
off of 15th St., Del Mar, CA
Sunday, February 20th at 4 p.m.
Village Community Presbyterian Church
6225 Paseo Delicias, Rancho Santa Fe

San Diego’s virtuosic vocal ensemble SACRA/PROFANA returns in a
program exploring themes of light and illumination. New works by young
American composers will be highlighted, including winners of the 2010
Choral Composition Contest. Also featuring music by Poulenc, Britten
and The Smashing Pumpkins, plus György Ligeti’s enigmatic choral
masterpiece “Lux Aeterna”.

15th Street Chamber Music says: “SACRA/PROFANA features some of the premier young voices from all around San Diego and Southern California and has a fresh new sound with a thrilling take on the art of choral music.”

More information at: www.sacraprofana.org.

There is a $10 suggested donation at the door. Reserve seats are
available by emailing us at:
15thstreetchambermusic@gmail.com

Quote of the day : the grip of darker powers

“Philosophers seem singularly unable to put asunder the aleatory and the epistemological side of probability. This suggests that we are in the grip of darker powers than are admitted into the positivist ontology. Something about the concept of probability precludes the separation which, Carnap thought, was essential to further progress. What?”

Ian Hacking, The Emergence of Probability (Cambridge: Cambridge University Press, 1975).

ITA 2011 : zombie blogging

I came down with the flu at the tail end of ITA, so I proceeded to fail at a bunch of things, like submitting reviews that were due, writing a submission for ISIT, and blogging the ITA Workshop in time. Cosma already blogged about his favorite talks, beating me to the punch.

I basically missed the first day because of technical glitches with our registration system, but once that was all resolved things went a bit more smoothly (at least I hope they did). The poster session seemed well-attended, and we shot videos of all the posters, which will be posted at some point. Ofer did a great job arranging the Graduation Day and poster events. The thing about these conferences is that you end up wanting to talk to people you haven’t seen in a while, and it’s good to hammer out research ideas during the daytime, so I only made it to the keynote and Xiao-Li Meng’s tutorial on MCMC. I felt like I followed the tutorial at the beginning, but even Prof. Meng’s engaging speaking style lost me when it came to modeling something about stars (?). Videos of the tutorials will be posted soon enough as well; I’ll probably make a post about those. For those who were at the entertainment program, the video for that was of course top priority. For the small number of blog readers who wish to know what I was making:

  • 2 oz. Maker’s Mark bourbon
  • 1 oz. Carpano Antica formula vermouth
  • 1 dash Angostura bitters
  • 1 dash Regan’s No. 6 orange bitters

Shaken with ice, served up with a cherry. I opted for a bourbon Manhattan with a cherry rather than a rye Manhattan with an orange twist (or no garnish) because it was more convenient, and also more 1960s than craft cocktail.

But on to the talks! I did manage to drag my lazy butt to some of them.

Improved rate-equivocation regions for secure cooperative communication
Ninoslav Marina, Hideki Yagi, H. Vincent Poor
They looked at a model where you have a transmitter and also a “blind” helper who is trying to help communicate over a wiretap channel. They show a better achievable rate-equivocation region by introducing another auxiliary random variable (big surprise!), but this doesn’t affect the best secrecy rate. So if you are willing to tolerate less than full equivocation at the eavesdropper then you’ll get an improvement.

Shannon’s inequality
S. Verdú, Princeton
Sergio talked about an alternative to Fano’s inequality used by Shannon:
P_e \ge \frac{1}{6} \frac{ H(X|Y) }{ \log M + \log \log M - \log H(X|Y) }
It was a nice talk, and the kind of talk I think is great at ITA. It’s not a new result, but ITA is a place where you can give a talk that explains some cool connection or new idea you have.
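To get a rough feel for how this bound behaves next to the usual weak form of Fano’s inequality, P_e \ge (H(X|Y) - 1)/\log M, here is a little numerical sketch of my own (not from the talk; base-2 logarithms and the value of M are my illustrative assumptions):

```python
import math

def shannon_bound(h_cond, M):
    """P_e >= (1/6) * H(X|Y) / (log M + log log M - log H(X|Y)), base-2 logs assumed."""
    return h_cond / (math.log2(M) + math.log2(math.log2(M)) - math.log2(h_cond)) / 6

def fano_bound(h_cond, M):
    """Weak form of Fano's inequality: P_e >= (H(X|Y) - 1) / log M."""
    return (h_cond - 1) / math.log2(M)

M = 256  # illustrative alphabet size
for h in [2.0, 4.0, 6.0]:
    print(f"H(X|Y) = {h}: Shannon bound {shannon_bound(h, M):.4f}, "
          f"Fano bound {fano_bound(h, M):.4f}")
```

Neither bound dominates in general; the interest of Shannon’s version is the explicit log M + log log M denominator.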

On the zero-error capacity threshold for deletion channels
Ian A. Kash, Michael Mitzenmacher, Justin Thaler, Jon Ullman
A nice piece of work connecting the zero-error capacity of deletion channels with longest common subsequences. The error model is adversarial. Make a graph where each vertex is a length-n binary string, and connect two vertices if the corresponding strings have a longest common subsequence of length at least (1 - p)n. If two strings are connected then they can’t be in the same code, since an adversary could delete pn bits from each and produce the common subsequence (note: not substring). A code is thus an independent set in this graph, so you can bound the capacity by bounding the largest independent set. And for that you can use… Turán’s theorem! Hooray! There are more results, of course…
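The construction is concrete enough to play with for tiny n. The sketch below is my own toy code, not from the paper: it builds the confusability graph for n = 4 and one adversarial deletion, computes the Turán-type lower bound alpha(G) >= N/(1 + average degree) on the independence number, and brute-forces the true maximum independent set (i.e. the largest zero-error code):

```python
from itertools import product

def lcs(a, b):
    """Length of the longest common subsequence (classic dynamic program)."""
    T = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            T[i + 1][j + 1] = T[i][j] + 1 if ai == bj else max(T[i][j + 1], T[i + 1][j])
    return T[len(a)][len(b)]

def confusability_graph(n, deletions):
    """Vertices are binary strings of length n; two are adjacent iff their
    LCS has length >= n - deletions (the adversary can confuse them)."""
    verts = ["".join(bits) for bits in product("01", repeat=n)]
    edges = {(i, j) for i in range(len(verts)) for j in range(i + 1, len(verts))
             if lcs(verts[i], verts[j]) >= n - deletions}
    return verts, edges

def max_independent_set(num_verts, edges):
    """Brute-force maximum independent set; only sane for tiny graphs."""
    adj = [0] * num_verts
    for u, v in edges:
        adj[u] |= 1 << v
        adj[v] |= 1 << u
    best = 0
    for mask in range(1 << num_verts):
        if all(adj[v] & mask == 0 for v in range(num_verts) if mask >> v & 1):
            best = max(best, bin(mask).count("1"))
    return best

verts, edges = confusability_graph(4, 1)  # n = 4, one adversarial deletion
avg_deg = 2 * len(edges) / len(verts)
turan_lb = len(verts) / (1 + avg_deg)     # Turan-type bound on alpha(G)
alpha = max_independent_set(len(verts), edges)
print(f"{len(verts)} vertices, {len(edges)} edges, "
      f"Turan bound {turan_lb:.2f}, alpha = {alpha}")
```

The actual paper works asymptotically in n, of course; this just makes the graph construction tangible.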

Data-driven decision making in healthcare systems
Mohsen Bayati, Stanford, Mark Braverman, U Toronto, Michael Gillam, Microsoft, Mark Smith, MedStar Health, and Eric Horvitz, Microsoft Research
This was a nice talk on doing feature selection via ideas from sparse signal processing/machine learning. The goal is to find a small set of features that predict whether a patient is at high or low risk of being readmitted soon after discharge from the hospital. The challenge is that the number of features is huge while the number of data points is small. They fit an L1-penalized logistic regression and then derive a threshold based on the cost of an intervention (e.g. house visits for high-risk patients).
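To illustrate the sparsity mechanism only (not their actual pipeline or data — the model, penalty, and synthetic “patients” below are all my own toy assumptions), here is a minimal L1-penalized logistic regression fit by proximal gradient descent, on data where only the first feature is informative:

```python
import math
import random

def sigmoid(z):
    # clipped to avoid overflow in exp
    return 1.0 / (1.0 + math.exp(-max(min(z, 30.0), -30.0)))

def l1_logistic(X, y, lam, lr=0.1, iters=1000):
    """L1-penalized logistic regression via proximal gradient descent (ISTA):
    a gradient step on the average logistic loss, then soft-thresholding,
    which drives the weights of uninformative features exactly to zero."""
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(iters):
        grad = [0.0] * d
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)))
            for j in range(d):
                grad[j] += (p - yi) * xi[j] / n
        for j in range(d):
            wj = w[j] - lr * grad[j]
            # soft-threshold: the proximal operator of lam * |w|_1
            w[j] = math.copysign(max(abs(wj) - lr * lam, 0.0), wj)
    return w

# Synthetic "patients": only feature 0 actually predicts readmission.
random.seed(0)
X = [[random.gauss(0, 1) for _ in range(5)] for _ in range(200)]
y = [1 if xi[0] > 0 else 0 for xi in X]
w = l1_logistic(X, y, lam=0.05)
selected = [j for j, wj in enumerate(w) if abs(wj) > 1e-9]
print("weights:", [round(v, 3) for v in w], "selected features:", selected)
```

The surviving nonzero weights are the “small set of features”; the cost-based threshold on predicted risk would then sit on top of this.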

Tracking climate models: advances in Climate Informatics
Claire Monteleoni, Columbia CCLS, Gavin A. Schmidt, NASA and Columbia, Shailesh Saroha, and Eva Asplund, Columbia Computer Science
This was an overview of Claire’s work on climate informatics. The basic problem is this: given several models (large-scale simulations based on PDEs derived from physics) that predict future temperature, how should you combine them to produce more accurate predictions? She used tools from her previous work on HMMs to build a system with better prediction accuracy.
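Her actual algorithms are HMM-based; as a stand-in, here is a sketch of the closely related fixed-share forecaster (the models, losses, and parameters below are my own toy choices, not hers). The idea it illustrates is the tracking one: downweight models by their recent loss, but mix a small fraction of weight back toward uniform each round so the combination can switch to whichever model is currently best:

```python
import math

def fixed_share(predictions, outcomes, eta=1.0, alpha=0.05):
    """Fixed-share forecaster: exponential weights on squared loss, plus a
    'share' step that redistributes a fraction alpha of the weight uniformly
    so the mixture can track a switching best expert."""
    k = len(predictions[0])
    w = [1.0 / k] * k
    combined = []
    for preds, y in zip(predictions, outcomes):
        combined.append(sum(wi * p for wi, p in zip(w, preds)))
        # exponential-weights update on squared loss
        w = [wi * math.exp(-eta * (p - y) ** 2) for wi, p in zip(w, preds)]
        total = sum(w)
        # normalize, then share a fraction alpha uniformly
        w = [(1 - alpha) * wi / total + alpha / k for wi in w]
    return combined

# Two toy "climate models": A is right for the first half, B for the second.
outcomes = [0.0] * 50 + [1.0] * 50
predictions = [[0.0, 1.0]] * 100
combined = fixed_share(predictions, outcomes)
loss = sum((c - y) ** 2 for c, y in zip(combined, outcomes))
print(f"combined squared loss: {loss:.2f} (each single model incurs 50.0)")
```

The combination incurs a small one-time cost at the switch point but beats either model alone over the whole sequence, which is the whole point of tracking.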

On a question of Blackwell concerning hidden Markov chains
Ramon van Handel
The problem is estimating the entropy rate of a process that is a function of a Markov chain (and hence not, in general, a Markov chain itself). “Does the conditional distribution of an ergodic hidden Markov chain possess a unique invariant measure?” This was a great talk for the Blackwell session because it started from a question posed by Blackwell and then revisited a few of his other works. Pretty amazing. Oh, and the paper (or one of them).

I think more talks will have to wait for another time (if ever).

Readings

The Solitudes (John Crowley) – The first book in the Aegypt Cycle, as recommended by Max. This book really blew my mind. I don’t see it as “fantasy” so much as an expansive meditation on memory and history, and I’m looking forward to finishing the rest of the four-book cycle.

Impossible Subjects: Illegal Aliens and the Making of Modern America (Mae M. Ngai) – a fascinating scholarly history of the idea of the “illegal immigrant” that provides much-needed context for our contemporary debate on the subject. Ngai shows how recent our notions of citizenship and immigration are via the history of the debates over, and revisions of, statutes from the 19th and 20th centuries. Highly recommended.

The Toughest Indian in the World (Sherman Alexie) – a collection of short stories by one of the most famous contemporary Indian novelists. It surprised me and shocked me at times, but some of the stories and images really stuck with me. I haven’t read Alexie’s other collections so I don’t know how it compares.

The Human Use of Human Beings (Norbert Wiener) – Wiener’s general-audience book on cybernetics has lots of gems that I’ve been blogging here and there. I found it interesting because his ideas were somewhat new at the time, and now they either seem antiquated or have been absorbed into our “default” view of things.

Absurdistan (Gary Shteyngart) – A madcap farce involving a massively overweight and fabulously wealthy Russian Jew trying to muddle his way through a massively dysfunctional Central Asian nation. It’s over the top and some readers may not enjoy the narrator’s neuroses, but it was pretty funny, if raw.

Numbers Rule (George Szpiro) – This is another book on the history of voting and electoral apportionment schemes. Szpiro takes us chapter by chapter through famous figures in the history of voting — from Ramon Llull through the Marquis de Condorcet and Charles Dodgson to Kenneth Arrow. A very entertaining read, if less of a page-turner than Poundstone’s book. The section on choosing the number of representatives for each state in the House is particularly fascinating.