tracks [a night in the life]

This is a mix for my dear friend Sin. It’s a little more episodic than the other two. Again, a mix of old standbys and some newer things.

Wrath, Ire, Fury
1. Dies Irae (Sir Georg Solti and the Chicago Symphony Orchestra and Chorus/G. Verdi)
2. Misery Is A Butterfly (Blonde Redhead)
3. How Fucking Romantic (The Magnetic Fields)
Preparation
4. Angel With An Attitude (The Ditty Bops)
5. Just One Of Those Things (Ella Fitzgerald/Cole Porter)
6. He’s Funny That Way (Billie Holiday)
Transit
7. Dodi (Don Byron)
8. Skokiaan (Louis Armstrong)
9. Ibid, Désmarches, Ibid (DJ Spooky)
10. Tancas serradas a muru (Dawn Upshaw/Osvaldo Golijov)
Clubs?
11. Sous Le Soleil De Bodega (Les Negresses Vertes)
12. Fame (David Bowie)
13. Obsolete (MC Solaar)
14. Remind Me (Röyksopp)
Nightcap
15. In The Waiting Line (Zero 7)
16. Shiki No Uta (Minmi & Nujabes)
17. One For My Baby (And One More For The Road) (Harold Arlen/Johnny Mercer)
Reflections
18. Auch Kleine Dinge (Dietrich Fischer-Dieskau/Hugo Wolf)
19. Autopsicografia (Antonio Carlos Jobim)
20. Desafinado (Ryuichi Sakamoto and Paula Morelenbaum/Jobim)

tracks [Rhode to Cincy]

This is for a Rhode trip to Cincinnati. I tried to branch out on this one a little bit more in terms of artists. I haven’t given it a test drive yet, but I’m sure someone will give me feedback on it.

1. Start Wearing Purple (Gogol Bordello)
2. The Purple People Eater (Sheb Wooley)
3. Inflammatory Writ (Joanna Newsom)
4. Stone Cold Dead In The Market (Ella Fitzgerald and Louis Jordan)
5. Old Car Blues (Paul Kotheimer)
6. Are You Gonna Be My Girl (Jet)
7. Moon Over The Freeway (The Ditty Bops)
8. A Summer Song (Chad & Jeremy)
9. Kaze Wo Atsumete (Happy End)
10. Prelude in E Major [WTC Book I] (Glenn Gould/J.S. Bach)
11. Fugue in E Major [WTC Book I] (Glenn Gould/J.S. Bach)
12. Azerbaijan Love Song (Dawn Upshaw/Luciano Berio)
13. Tank! (Yoko Kanno)
14. Qui Veut (Ol’Kainry)
15. Dancing In The Street (David Bowie and Mick Jagger)
16. Saturday Night Fish Fry (Buddy Guy)
17. She Caught The Katy (The Blues Brothers)
18. Misty Mountain Hop (Led Zeppelin)
19. I Walk The Line (Johnny Cash)
20. I’ll Follow The Sun (Don Byron/The Beatles)

tracks [traveling thither and yon]

One of three compilations on which I’m working, and the first to be finished. This is for Deb — a fanciful voyage thither and yon, since she can’t seem to stay in one place for more than a year. Pretty standard stuff — I always pick the same few artists over and over again.

1. Mona Liao Announcement (Michael Ouellette/Carolyn Chen)
2. Get Out Of Town (Ella Fitzgerald/Cole Porter)
3. Positively 4th Street (Bob Dylan)
4. Downtown Train (Tom Waits)
5. Come Back From San Francisco (Magnetic Fields)
6. Road To Nowhere (Talking Heads)
7. Alabama-Song (Ute Lemper/Kurt Weill)
8. Take Me Home, Country Roads (John Denver)
9. Take The A Train (Duke Ellington)
10. Yellow Submarine (The Beatles)
11. A Foggy Day (Dakota Staton)
12. Le Bateau Ivre (Charlie Hunter Quintet)
13. Siberian Sleighride (Raymond Scott)
14. Thousands Are Sailing (The Pogues)
15. Fujiyama (Dave Brubeck)
16. Fly Me To The Moon (Frank Sinatra)
17. World Weary (Noël Coward)

Coca-Cola Co. can’t do math

I decided to have a Coke today and in order to scare myself into not drinking them anymore, I looked at the nutrition facts label.

Nutrition Facts          Standard Serving   This Package
Serving Size             8 fl oz            20 fl oz
Servings per container   2.5                1
Calories                 100                240

Is this just another example of how Americans are falling behind in math? Or do calories not scale linearly with volume?

Actually, there is an explanation — apparently 8 fl oz = 240 ml, but 20 fl oz = 591 ml! So really the problem is that the metric system of measurement is nonlinear. That’s because it was invented by Europeans, who are trying to cheat us, clearly. That, or Americans are falling behind in math.
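For what it's worth, a few lines of Python make the discrepancy (and my guess at the boring real explanation) explicit. The numbers are straight off the label:

    # Figures from the label on the 20 fl oz bottle.
    per_serving_cal = 100   # calories per 8 fl oz serving
    servings = 2.5          # servings per container
    package_cal = 240       # calories listed for the whole bottle

    print(per_serving_cal * servings)  # 250.0 -- not the 240 on the label

    # My guess: the per-serving figure is itself rounded.
    print(package_cal / servings)      # 96.0, which rounds up to the label's 100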

paper a day (month?) : isomap

A Global Geometric Framework for Nonlinear Dimensionality Reduction
J. B. Tenenbaum, V. de Silva, J. C. Langford
Science, vol. 290, pp. 2319–2323, 22 December 2000.

I have to present this paper for a computational neuroscience reading group that I can (finally) attend. The basic problem is something called manifold learning. Imagine you have a very large data set in a huge number of dimensions — for example, 64 x 64 pixel images of faces, which live in a 4096-dimensional space. Furthermore, suppose that all of the pictures are of the same person, with only two parameters changing — the angle of rotation of the face, and the illumination. The data has only two degrees of freedom, so you would think it would live on a 2-dimensional subspace.

Unfortunately, the data you have doesn’t occupy a linear subspace in the observed variables. Instead, it lives on a manifold, which is like a surface in your high-dimensional space that may be strangely curved or twisted, and so may be very poorly approximated by a linear subspace. However, the manifold does have its own coordinate system, and you can calculate distances between points on the manifold. The shortest path between two points is called a geodesic.

Another way to visualize this is a ribbon curled into a spiral. A point A on the ribbon might look close to a point B on an outer ring, but if you unwrapped the ribbon they would be far apart. So one thing you might try is to figure out the real distance between A and B along the surface of the ribbon. You could approximate that by hopping from data point to data point in short hops that follow the contour of the ribbon.
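To make the ribbon picture concrete, here’s a toy numpy sketch (my own example, not something from the paper). The endpoints of a spiral look close in the plane, but the distance along the ribbon is an order of magnitude larger:

    import numpy as np

    # A 1-D "ribbon" curled into a spiral (an Archimedean spiral in the plane).
    t = np.linspace(3 * np.pi, 9 * np.pi, 500)        # position along the ribbon
    X = np.column_stack([t * np.cos(t), t * np.sin(t)])

    A, B = X[0], X[-1]                    # a point on an inner ring, one on the outer
    straight = np.linalg.norm(A - B)      # distance through the ambient space

    # Approximate the geodesic by summing short hops between neighboring points.
    hops = np.linalg.norm(np.diff(X, axis=0), axis=1)
    along_ribbon = hops.sum()

    print(straight, along_ribbon)  # ~19 vs. ~355: unwrapped, A and B are far apart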

That is exactly what the Isomap algorithm, described in this paper, does to perform its dimensionality reduction. Given a set of points {xk : k = 1, 2,…, K} in n-dimensional space X, they first make a graph G with vertex set {xk} by putting an edge between two points if the distance between them in X is less than some threshold; each edge is weighted by that distance. Then they find a K x K matrix D whose (i,j)-th entry is the length of the minimum-weight (shortest) path in the graph G between xi and xj. Assuming the threshold is not too large and there are a lot of data points, these lengths should closely approximate the true (geodesic) distances on the manifold.
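Here is a minimal sketch of those first two steps (my own rendering in Python, not the authors’ MATLAB code; the paper also describes a k-nearest-neighbor variant of the graph construction):

    import numpy as np
    from scipy.spatial.distance import pdist, squareform
    from scipy.sparse.csgraph import shortest_path

    def geodesic_distances(X, eps):
        """Steps 1-2 of Isomap: neighborhood graph, then all-pairs shortest paths.

        X   : (K, n) array of K points in n-dimensional space
        eps : threshold below which two points get an edge
        """
        D = squareform(pdist(X))             # K x K Euclidean distances in X
        G = np.where(D <= eps, D, np.inf)    # non-edges marked with infinity
        np.fill_diagonal(G, 0.0)
        # Shortest-path lengths in G approximate geodesics on the manifold.
        return shortest_path(G, method="D")  # "D" = Dijkstra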

Armed with this matrix of distances between points, we can try to embed the manifold into a d-dimensional Euclidean space without distorting the distances too much. This is an easy problem to visualize — just imagine taking a globe, fixing a few cities, and trying to make a flat map in which the distances between those cities are preserved. There is an algorithm called multidimensional scaling (MDS) that can do this. You can trade off the embedding distortion with the number of dimensions.
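Classical MDS has a tidy closed form: double-center the squared distances and read the coordinates off the top eigenvectors. A sketch to go with the graph code above, assuming the neighborhood graph came out connected (otherwise the distance matrix contains infinities):

    import numpy as np

    def classical_mds(D, d=2):
        """Embed a K x K distance matrix D into d-dimensional Euclidean space."""
        K = D.shape[0]
        J = np.eye(K) - np.ones((K, K)) / K      # centering matrix
        B = -0.5 * J @ (D ** 2) @ J              # double-centered squared distances
        w, V = np.linalg.eigh(B)                 # eigenvalues in ascending order
        w, V = w[::-1][:d], V[:, ::-1][:, :d]    # keep the top d eigenpairs
        return V * np.sqrt(np.maximum(w, 0.0))   # rows are the embedded points

Feeding the output of geodesic_distances into this gives the whole pipeline, and the leftover eigenvalues tell you how much distortion you’re trading for fewer dimensions.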

This paper comes with its own homepage, which has some data sets and MATLAB code. If only all practitioners were so generous — too often the algorithm implementation is kept under wraps, which makes me wonder if there are some dirty secrets hiding behind the pretty plots.

One thing that annoyed me about reading this paper is that all of the technical details (which I care about) are hidden in tiny-print footnotes. Furthermore, the citations don’t include paper titles, so you can’t tell what the cited papers are actually about. I know that page space is precious, but it’s just plain stupid. Shame on you, Science. I expected better.

As an amusing postscript, the commentary on this paper and the locally linear embedding paper (Roweis and Saul), written by Seung and Lee, has pictures of Bush and Gore in the print edition, but due to copyright issues the online version had to be changed.

darjeeling blues

The problem with migrating from bagged tea to loose tea is that you have to use up the residual bags — two lone “Constant Comment” bags from god-knows-when, a smattering of Tazo tea given as a present by a Starbucks-lover who didn’t know any better, and other mysterious bags of unknown provenance. This morning I decided to do my duty and make one of the Twinings Darjeeling bags. The plus side: it seems nearly impossible to oversteep this bag, as long as you actually want to drink your tea and don’t forget that you made it. The negative side: it tastes nothing like Darjeeling. It was merely lighter than my usual Twinings standby, the English Breakfast. In contrast, this weekend I tried the Moondakotee Estate FTGFOP1 (Second Flush), which came in a sampler pack, and found it to be delightful — huge leaves and a nice robust body with all the floral notes and so on. I’ve had a few cheaper ones which have also been a pleasure to drink. It’s not a fair comparison, since the loose tea cost about 50% more than the bagged stuff, but does there even exist a decent Darjeeling bag, or should I just use the remaining ones to help my tired eyes?

pybliographer

Today I discovered pybliographer, a decent (if not perfect) BibTeX management tool. Surprisingly, Ubuntu already had a package for it (as do Fedora and Mandrake, I think), so it was a breeze (-y badger?) to install. I think I might just start maintaining a single huge BibTeX file and then pull out paper-appropriate subsets as I need them. I’m hoping they add folders or something in later versions.
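Until then, pulling out the subset for a given paper is easy enough to script. A quick-and-dirty Python sketch (it assumes each entry’s closing brace sits alone on its own line; a real tool like bibtool handles this more robustly):

    import re
    import sys

    def wanted_keys(aux_path):
        """Collect the citation keys from a LaTeX .aux file."""
        keys = set()
        for line in open(aux_path):
            m = re.match(r'\\citation\{(.+)\}', line)
            if m:
                keys.update(k.strip() for k in m.group(1).split(','))
        return keys

    def extract(bib_path, keys):
        """Print the entries of the master .bib file whose keys were cited."""
        keep = False
        for line in open(bib_path):
            m = re.match(r'@\w+\{\s*([^,]+),', line)
            if m:
                keep = m.group(1).strip() in keys
            if keep:
                print(line, end='')
                if line.strip() == '}':   # naive end-of-entry marker
                    keep = False

    if __name__ == '__main__':
        # usage: python subset.py paper.aux master.bib > paper.bib
        extract(sys.argv[2], wanted_keys(sys.argv[1]))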

When I get my schmancy new Mac laptop I’ll use BibDesk, which looks even better.

three quotes from James Munkres

I found these on a post-it note from his book Topology. I took the class in spring 2000, which feels like ages ago. He ended up writing me a recommendation letter for grad school, so I suppose I have him to thank for where I am now.

  • [2/22/00] (on the topological definition of continuity) “All this exists in the never-never land of abstract set theory.”
  • [3/17/00] “History’s dead.”
  • [4/7/00] “If somebody said I was completely normal, I’d hit them.”

I have more quotes somewhere in my notes for the class, but I have no idea where those are. I have tons of juicy quotes in my probability notes, but those have been AWOL for years, more’s the pity.

mo phat ale

Ok, that’s not the best anagram for Opal Mehta, but it’s the best I could do on short notice. I’ve watched the story unfold over the past weeks, starting with the original Harvard Crimson article, and then all the collective handwringing and schadenfreude. On the one hand, I think she’s a dumb kid who was caught and should pay for it, but not for the rest of her life. On the other, she’s 18, and officially an adult, so I guess she should have expected this. But maybe we should spread the blame around to her money-grubbing producers and the “packaging company” that shares the copyright.

I have to admit that I’m baffled by this essay from Sandip Roy. He, tongue in cheek, thanks Viswanathan for proving “that finally we can fail, that we can screw up spectacularly and live to tell the tale.” He then goes into a lengthy standard complaint about upper middle class Indians in the US, the model minority thing, and overachieving and pushy parents. It’s about the system from within the system, and says nothing about class disparity within the South Asian community in the US, the differences between recent versus established immigrants, the Hindu/Muslim gap, or any of that.

In pointing out how Kaavya-gate (as some are calling it) helps disprove the model minority myth by proving that South Asians aren’t all superhuman superachievers, Roy can be seen to reify that stereotype. Implicit in his “not superhuman” claim we can find “but still high-achievers.” That’s too much, I think. His point is that these pushy parents need to find some perspective. But does the Opal Mehta debacle really point that out? I don’t think so — this lacks the kind of Aristotelian tragic ending that would really send the message home. Roy wants to indict the parents with the child. But to do that we would need some anagnorisis (the tragic hero’s recognition of their own flaw) coming from them. Instead we have some crap about photographic memories and unintentional internalization. No amount of media spectacle will affect the hordes of pushy parents unless the pushiness itself can be unambiguously blamed.

So Roy’s essay seems off-mark to me. But maybe if I have mo phat ale I’ll start to think differently.