# Braised mizuna and oyster mushrooms

I am headed out of town tomorrow, but I wanted to cook up my ill-advised gains from the Logan Square farmer’s market — mizuna and oyster mushrooms. I was inspired by this ohitashi variation, but wanted something heartier to eat with soba, so I decided to braise the greens with ginger and dashi. This recipe may need tweaking depending on the saltiness of your dashi, etc.

Braised oyster mushrooms, turnips, and mizuna over soba

Ingredients
4 medium Japanese turnips, sliced thinly
1/2 – 1 lb oyster mushrooms, sliced
1 bunch mizuna

3 tbsp diced or grated ginger
3 tbsp mirin or sake
2/3 cup dashi (from scratch or bottle)
2 tbsp soy sauce
peanut oil

cooked soba (buckwheat) noodles

Lightly coat a wok or pan with oil and cook the turnips over medium-high heat until softened and some are lightly browned. Remove the turnips, add a little more oil, and cook the mushrooms until they soften and give up their liquid. Return the turnips and mix. Add the mirin/sake and mix well until it cooks off. Make a space in the middle, add a little more oil, and cook the ginger until aromatic, then mix everything together. Add the mizuna and mix, then add the dashi and soy sauce. Simmer until the broth reduces and the mizuna wilts, but not too long. Serve over soba.

# Toolkit revisited

I joined TTI Chicago almost a year ago, and it’s been an interesting time here. Since my background is a bit different from that of most of the other folks here, I have many moments of “academic cognitive dissonance,” as it were — but more on that later. Madhur Tulsiani is going to offer a toolkit course in the spring focusing on mathematical tools for CS theory, so I wanted to revisit a topic from a few years ago: what an EE-systems/theory “toolkit” would look like. I think a similar course or seminar would be really handy (even for self-study), but the topics we came up with before seem a little dated now. The topics seem to fall into a few categories:

• advanced stochastic processes: stochastic approximation
• mathematical economics: game theory, auctions, mechanism design
• advanced probability: concentration of measure, random graphs
• optimization: stochastic control, dynamic programming, convex optimization
• mathematical statistics: asymptotic statistics, minimax theory

Roy’s observation that these topics are already covered in graduate syllabi is still apt. But I think that knowing a smattering of these topics is important for general literacy and critical reading of papers. When reading a new paper, I first situate the techniques within the context of things I already know about — if I have to absorb the author’s cursory description of the general method as well as its application to the problem at hand, I get bogged down in the former and find the latter mystifying.

Actually, I think what would be great is to make tutorials on these topics and gather them together. I know that people who make research tutorials spend a lot of time on them, and there’s some reluctance to gather them together, but these topics are not bleeding edge and could be part of a course. It’s sort of like Connexions, but perhaps a little less wiki-like and more lecture-notes-like. What would be the best way to do that?

As an aside, Madhur is also thinking of doing a more focused course later which would cover coding and information theory for (theoretical) computer scientists. I’ve thought a fair bit about such a course focused on machine learning — focusing a bit more on statistical issues like redundancy and Sanov’s theorem instead of Gaussian channels. But how could one do an information theory course without $\frac{1}{2} \log( 1 + \mathsf{SNR} )$?
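(For context, that expression is the standard capacity of the additive white Gaussian noise channel — a textbook result, stated here for readers outside information theory rather than anything specific to the proposed course:)

```latex
% Discrete-time AWGN channel: Y = X + Z, with noise Z ~ N(0, N),
% input power constraint E[X^2] <= P, and SNR = P/N.
% The channel capacity (in bits per channel use, log base 2) is
C \;=\; \max_{p(x)\,:\,\mathbb{E}[X^2] \le P} I(X;Y)
  \;=\; \frac{1}{2} \log\!\left(1 + \frac{P}{N}\right)
  \;=\; \frac{1}{2} \log\left(1 + \mathsf{SNR}\right),
% achieved by a Gaussian input X ~ N(0, P).
```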