http://blog.mrtz.org/2015/03/13/practicing-differential-privacy.html

Btw, your perception of “DP hype” seems bewildering to me. I’ve always found the DP naysayers to greatly outweigh the DP proponents in number and stubbornness.

This cult of epsilon is dangerous and would lead to a multi-year winter in all sorts of data-mining research. Especially in medicine, where lives are at risk, these kinds of issues have to be taken very seriously.

This paper shows nothing of the sort. I agree that there is a significant amount of theoretical research that may turn up algorithms and approaches that prove useless in practice, but this is, in fact, the process of doing research. Presumably in medical research every single experiment turns out to be an amazing new discovery? No? In fact, one goes down dead ends, sometimes for years? Well, I’ll be!

As a matter of fact, I am working with people to see how we can implement differential privacy in neuroinformatics analysis pipelines that analyze “real data” from “real subjects.” It’s an open question whether we can get meaningful values for epsilon, but I’m trying.
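For readers unfamiliar with what “meaningful values for epsilon” means in practice, here is a minimal sketch of the standard Laplace mechanism applied to a clipped mean. The function name, data, and bounds are my own illustrations, not anything from the pipelines mentioned above; the point is just how epsilon trades privacy against noise.

```python
import numpy as np

def laplace_mean(values, lower, upper, epsilon):
    """Epsilon-DP estimate of a mean via the Laplace mechanism.

    Clipping each value to [lower, upper] bounds the sensitivity of
    the mean at (upper - lower) / n; adding Laplace noise with scale
    sensitivity / epsilon then satisfies epsilon-differential privacy.
    """
    clipped = np.clip(values, lower, upper)
    n = len(clipped)
    sensitivity = (upper - lower) / n
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return float(np.mean(clipped) + noise)

# Hypothetical measurements on a 0-100 scale.
data = [72, 88, 64, 95, 70]
# Smaller epsilon -> more noise -> stronger privacy, worse utility.
for eps in (0.1, 1.0, 10.0):
    print(eps, laplace_mean(data, 0, 100, eps))
```

The tension the thread is arguing about is visible here: at epsilon near 0.1 the added noise can swamp the signal for small cohorts, while epsilons large enough to be useless as privacy guarantees make the estimate accurate. Whether a value in between is “meaningful” for a given medical dataset is exactly the open question.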

If you really are an “actual” medical researcher, you certainly have a very particular and visceral reaction to CS theory. I’d posit that the amount of money “wasted” on medical research that uses flawed statistics, falsified data, and improperly documented protocols and analyses would pay for all of the theory grants awarded each year many times over.

Or, as we do in crypto, give up on perfect security and fall back on computational models that assume the hardness of certain problems, such as factoring. The moral equivalent is what will need to happen in practical systems, I think.
