A few weeks ago I got to go to Bellairs in Holetown, Barbados for a workshop organized by Mike Rabbat and Mark Coates of McGill University. It’s a small workshop, mostly for Mike and Mark’s students, and it’s a chance to interact closely and perhaps start some new research collaborations. Here’s a brief summary of the workshop as I remember it from my notes.
Petar Djuric gave two talks since his colleague Monica Bugallo couldn’t make it. The first talk was on “Sequential signal processing with Dirichlet mixture models” wherein he gave a tutorial on Dirichlet priors and nonparametric Bayesian methods and then used them to address a problem of annotating data from fetal heart rate signals. The nonparametric approach is important because we want to understand what the data suggests about the number of categories with which one could annotate a segment of signals. He also gave Monica’s talk on “Particle Filtering for Complex Systems,” which was a tutorial on particle filters for high-dimensional signals. This is challenging because of a curse of dimensionality: we need too many particles, which makes the filtering computationally expensive. The goal here was to reduce the complexity by generating parallel particle filters that can exchange information when the particles get close (e.g. for multiple target tracking). I am far from knowledgeable about particle filters, but the experimental results looked really compelling.
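Since I had to remind myself how the basic machinery works, here is a minimal bootstrap particle filter sketch in Python for a scalar toy model, just the textbook propagate/reweight/resample loop. The model, noise levels, and particle count are invented for illustration, and this is not the parallel scheme from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(T=50, q=1.0, r=1.0):
    """Simulate a scalar nonlinear state-space model with additive Gaussian noise."""
    x, y = np.zeros(T), np.zeros(T)
    x[0] = rng.normal()
    for t in range(T):
        if t > 0:
            x[t] = (0.5 * x[t - 1] + 25 * x[t - 1] / (1 + x[t - 1] ** 2)
                    + 8 * np.cos(1.2 * t) + rng.normal(scale=np.sqrt(q)))
        y[t] = x[t] + rng.normal(scale=np.sqrt(r))
    return x, y

def bootstrap_pf(y, n_particles=500, q=1.0, r=1.0):
    """Bootstrap particle filter: propagate, reweight by the likelihood, resample."""
    particles = rng.normal(size=n_particles)
    estimates = np.zeros(len(y))
    for t in range(len(y)):
        if t > 0:
            # propagate each particle through the state dynamics
            particles = (0.5 * particles + 25 * particles / (1 + particles ** 2)
                         + 8 * np.cos(1.2 * t)
                         + rng.normal(scale=np.sqrt(q), size=n_particles))
        # weight particles by how well they explain the new measurement
        w = np.exp(-0.5 * (y[t] - particles) ** 2 / r)
        w += 1e-12          # guard against total weight underflow
        w /= w.sum()
        estimates[t] = np.dot(w, particles)
        # multinomial resampling to fight weight degeneracy
        particles = rng.choice(particles, size=n_particles, p=w)
    return estimates

x, y = simulate()
xhat = bootstrap_pf(y)
print("RMSE:", np.sqrt(np.mean((x - xhat) ** 2)))
```

Even in one dimension the reweight/resample steps are the whole game; the trouble in high dimensions is that the weights become uninformative unless the particle count grows very fast.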
McGill student Milad Kharratzadeh talked about a problem of modeling sparse signals which he called “Sparse Multivariate Factor Regression.” Basically they posit latent factors explicitly, so the model is that the outcome $\mathbf{y} = \mathbf{B}\mathbf{x} + \boldsymbol{\epsilon}$, where $\mathbf{x}$ and $\mathbf{y}$ are vector inputs and outputs, but the dictionary $\mathbf{B} = \mathbf{A}\mathbf{W}$ decomposes into the product of two sparse, rank-$r$ matrices. He showed an alternating minimization for the resulting LASSO-like biconvex minimization and then provided some experimental evidence on data from a bikesharing service in Montreal.
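To fix ideas about the alternating structure, here is a rough numpy sketch based on my reading of the model: fit $Y \approx X(AW)^T$ with both factors L1-penalized, alternating a few proximal-gradient (ISTA) steps on each factor. The exact objective, penalties, ranks, and step sizes here are my guesses, not necessarily the formulation from the talk or the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def soft_threshold(M, tau):
    """Elementwise soft-thresholding, the proximal operator of the L1 norm."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def sparse_factor_regression(X, Y, r, lam_A=0.1, lam_W=0.1, n_outer=50, n_inner=20):
    """Alternately minimize 0.5*||Y - X W^T A^T||_F^2 + lam_A*||A||_1 + lam_W*||W||_1
    over A (q x r) and W (r x p). Each half-step is a LASSO solved approximately
    with a few proximal-gradient (ISTA) iterations; the joint problem is only biconvex."""
    p, q = X.shape[1], Y.shape[1]
    A = rng.normal(size=(q, r))
    W = rng.normal(size=(r, p))
    for _ in range(n_outer):
        # update A with W fixed: a LASSO on the reduced features Z = X W^T
        Z = X @ W.T
        step = 1.0 / (np.linalg.norm(Z, 2) ** 2 + 1e-12)
        for _ in range(n_inner):
            grad_A = (Z @ A.T - Y).T @ Z          # gradient of the smooth term w.r.t. A
            A = soft_threshold(A - step * grad_A, step * lam_A)
        # update W with A fixed: a LASSO on W
        step = 1.0 / ((np.linalg.norm(X, 2) * np.linalg.norm(A, 2)) ** 2 + 1e-12)
        for _ in range(n_inner):
            R = X @ W.T @ A.T - Y                 # residual, n x q
            grad_W = A.T @ R.T @ X                # gradient of the smooth term w.r.t. W
            W = soft_threshold(W - step * grad_W, step * lam_W)
    return A, W

# toy usage on synthetic data with a sparse ground-truth dictionary
X = rng.normal(size=(200, 30))
B_true = soft_threshold(rng.normal(size=(10, 30)), 1.0)
Y = X @ B_true.T + 0.1 * rng.normal(size=(200, 10))
A, W = sparse_factor_regression(X, Y, r=5)
print("relative fit error:", np.linalg.norm(X @ (A @ W).T - Y) / np.linalg.norm(Y))
```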
Milad was followed by Jun Ye Yu, who talked about how to track a target using sensors that only measure bearings. For targeting on the Earth, it turns out that the curvature of the surface matters, so his talk was entitled “Distributed Bearing-only Tracking on Spherical Surfaces.” He also used a particle filtering approach for the tracking but had to modify the updates to account for the curvature. It turns out that accounting for this effect has a pretty large impact on the empirical performance of the tracking algorithm.
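The geometric quantity that changes on a sphere is the bearing itself: it is the initial heading of the great circle from sensor to target rather than a planar angle. Here is that standard spherical-trigonometry computation (just the geometry, not the modified filter updates from the talk):

```python
import numpy as np

def great_circle_bearing(lat1, lon1, lat2, lon2):
    """Initial bearing (radians, clockwise from north) of the great circle
    from point 1 to point 2, with latitudes and longitudes in radians."""
    dlon = lon2 - lon1
    x = np.sin(dlon) * np.cos(lat2)
    y = np.cos(lat1) * np.sin(lat2) - np.sin(lat1) * np.cos(lat2) * np.cos(dlon)
    return np.arctan2(x, y)

# e.g., bearing from Montreal to Barbados (coordinates are approximate)
mtl = np.radians([45.5, -73.6])
bgi = np.radians([13.1, -59.6])
print(np.degrees(great_circle_bearing(*mtl, *bgi)))  # roughly south-southeast
```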
I gave a talk on my work with Tara Javidi and Anusha Lalitha, based on our recent TAC paper and the journal version of our ISIT paper.
Mikael Johansson gave a talk on distributed optimization with delayed information. A lot of what we know covers the extremes of asynchrony: totally asynchronous on one end, nearly synchronous on the other. He gave a really nice overview of known results before diving into the open middle ground and what he called “pseudo-contractions” to get some convergence rates for different types of delay. The talk concluded with some discussion of gradient-based algorithms like an asynchronous incremental gradient in a worker/coordinator model, and coordinate descent methods with delay.
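To see why delay matters at all, here is a toy simulation (mine, not Mikael’s analysis) of gradient descent on a least-squares problem where each update applies a gradient evaluated at an iterate from a few steps earlier; the problem, step size, and delays are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def delayed_gradient_descent(A, b, step, delay, n_iters=300):
    """Minimize 0.5*||Ax - b||^2, but each update applies the gradient evaluated
    at the iterate from `delay` steps ago (an idealized stale-gradient model)."""
    history = [np.zeros(A.shape[1])]
    for t in range(n_iters):
        stale = history[max(0, t - delay)]
        grad = A.T @ (A @ stale - b)
        history.append(history[-1] - step * grad)
    x_star = np.linalg.lstsq(A, b, rcond=None)[0]
    return np.linalg.norm(history[-1] - x_star)

A = rng.normal(size=(50, 10))
b = rng.normal(size=50)
step = 0.25 / np.linalg.norm(A, 2) ** 2   # a step size that is safe with no delay
for delay in (0, 2, 10):
    # with a fixed step, growing delay slows convergence and can destabilize it entirely
    print(f"delay={delay:2d}  final error={delayed_gradient_descent(A, b, step, delay):.2e}")
```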
Shohreh Shaghaghian described an opportunistic forwarding method for minimizing latency when forwarding information in “social” networks. Basically each edge has a Poisson process associated to it which says when the two nodes “meet.” The goal is to get packets from source to destination: if I meet a friend, should I give them the packet or wait until I meet someone a bit closer? They showed a distributed algorithm for each node to decide which neighbors should get the forwarded packet, and ran some experiments on a data set of real meetings between participants collected at INFOCOM 2005.
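Here is a toy Monte Carlo sketch of the forward-or-wait trade-off with a single relay and exponential inter-meeting times; the rates are made up and this is not the distributed algorithm from the talk.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_delay(lam_sd, lam_sr, lam_rd, forward, n_trials=100_000):
    """Monte Carlo delivery delay from source S to destination D with one
    potential relay R. Pairwise meetings are independent Poisson processes.
    If `forward`, S hands the packet to R the first time they meet; otherwise
    S keeps it until it meets D directly."""
    t_sd = rng.exponential(1 / lam_sd, n_trials)   # time until S meets D
    t_sr = rng.exponential(1 / lam_sr, n_trials)   # time until S meets R
    t_rd = rng.exponential(1 / lam_rd, n_trials)   # residual time for R to meet D
    if not forward:
        return t_sd.mean()
    # forward only if S meets R before meeting D (residuals are memoryless)
    delay = np.where(t_sd < t_sr, t_sd, t_sr + t_rd)
    return delay.mean()

# R meets D five times as often as S does, so forwarding should pay off
lam_sd, lam_sr, lam_rd = 0.1, 1.0, 0.5
print("keep:   ", simulate_delay(lam_sd, lam_sr, lam_rd, forward=False))
print("forward:", simulate_delay(lam_sd, lam_sr, lam_rd, forward=True))
```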
Ioannis Bakagiannis discussed some recent work on trying to apply accelerated gradient methods to the kind of optimization problems that Mikael discussed. The results here were pretty preliminary so I don’t want to give too much away… you’ll just have to wait for the paper.
Mark Coates gave a very interesting, but math-heavy, talk on a possible connection between certain approaches to particle filtering called the “particle flow filter,” which is a sort of continuous-time embedding of the Bayes update, and optimal transportation, which also seeks to link two distributions. This was all a bit new to me, but it did inspire me to check out Villani’s Optimal Transport book, which has made some interesting bedtime reading for the last few days.
Michael Rabbat gave a new model for a similarity search problem that looks a bit like associative memories. This reminded me a lot of problems in group testing and multiuser detection, but the model was a bit different. There’s a paper on ArXiV with more details, but basically the idea is to store sums of data points and then when presented with a new point, one can query the sum to find similarities. So the sums are “good enough” for a first round of detection. They did some experiments on image datasets which had promising results.
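Here is a toy sketch of the sum-then-screen idea, with the grouping, dimensions, and scoring all chosen for illustration rather than taken from the paper: store one sum per group of data points, score the sums against the query as a first round, and only then look inside the most promising groups.

```python
import numpy as np

rng = np.random.default_rng(4)

def build_sums(data, group_size):
    """Partition the dataset into consecutive groups and keep only each group's sum."""
    groups = [data[i:i + group_size] for i in range(0, len(data), group_size)]
    return groups, np.array([g.sum(axis=0) for g in groups])

def two_stage_search(query, groups, sums, group_size, n_probe=10):
    """Stage 1: score every stored sum by its inner product with the query.
    Stage 2: exhaustively search only the members of the top-scoring groups."""
    probe = np.argsort(sums @ query)[::-1][:n_probe]
    best_idx, best_score = -1, -np.inf
    for g in probe:
        scores = groups[g] @ query
        j = int(np.argmax(scores))
        if scores[j] > best_score:
            best_score, best_idx = scores[j], g * group_size + j
    return best_idx, best_score

# toy data: unit-norm random vectors; the query is a noisy copy of one of them
data = rng.normal(size=(2000, 256))
data /= np.linalg.norm(data, axis=1, keepdims=True)
target = 1234
query = data[target] + 0.02 * rng.normal(size=256)
groups, sums = build_sums(data, group_size=20)
found, score = two_stage_search(query, groups, sums, group_size=20)
print("found index", found, "true index", target)
```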
Finally, Yunpeng Li gave a really interesting talk about designing a “smart bra” that uses microwave pulses to detect breast tumors. Basically there is an array of transmitter/receiver pairs that send pulses and try to detect the presence of tumor tissue, which has different propagation characteristics than healthy tissue. It’s safe, non-invasive, and potentially a game-changer for cancer screening. The hard part is the very low SNR and the large number of antennas. He basically used some ensemble tricks to get better classification performance. The next step is to gather data from real patients (so far they have mostly been working with physical “dummy” models) in order to characterize real tumor and healthy tissue.
After all that research, really the only thing to do is to hit the beach and go for a nice swim. It’s a real luxury in February!