Following up on Hal Daume’s post and John’s post on cool and interesting things seen at NIPS, I’ll post my own little list of neat papers here as well. Of course it’s going to be biased towards what I think is interesting. Also, I have to say that I wasn’t able to see many papers at NIPS this year because I was too busy, so please feel free to contribute the papers that you liked 🙂
1. P. Mudigonda, V. Kolmogorov, P. Torr. An Analysis of Convex Relaxations for MAP Estimation. A surprising paper which shows that many of the more sophisticated convex relaxations proposed recently turn out to be subsumed by the simplest LP relaxation. Be careful next time you try a cool new convex relaxation!
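For the curious, here is the basic LP relaxation in question, written in my own notation for a pairwise MRF (a sketch, not lifted from the paper): MAP estimation relaxes to maximizing a linear objective over locally consistent pseudomarginals \mu.

```latex
\begin{aligned}
\max_{\mu \ge 0}\quad & \sum_i \sum_{x_i} \theta_i(x_i)\,\mu_i(x_i)
  \;+\; \sum_{(i,j)\in E} \sum_{x_i,x_j} \theta_{ij}(x_i,x_j)\,\mu_{ij}(x_i,x_j) \\
\text{s.t.}\quad & \sum_{x_i} \mu_i(x_i) = 1 \quad \text{for all } i, \\
& \sum_{x_j} \mu_{ij}(x_i,x_j) = \mu_i(x_i) \quad \text{for all } (i,j)\in E \text{ and all } x_i.
\end{aligned}
```

If the optimum happens to be integral it recovers the exact MAP assignment; the paper’s message is that the fancier relaxations end up no tighter than this simple one.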
2. D. Sontag, T. Jaakkola. New Outer Bounds on the Marginal Polytope. The title says it all. The marginal polytope is the set of local marginal distributions over subsets of variables that are globally consistent, in the sense that at least one joint distribution over all the variables agrees with every local marginal. It is an interesting mathematical object to study, and this work builds on Martin Wainwright’s paper on upper bounding the log partition function, proposing improved outer bounds on the marginal polytope.
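In symbols (my own shorthand, for the pairwise case), the marginal polytope of a graph G is

```latex
\mathbb{M}(G) \;=\; \Bigl\{ \mu \;\Bigm|\; \exists\, p(x)\ \text{such that}\ 
  \mu_i(x_i) = \sum_{x \setminus x_i} p(x),\quad
  \mu_{ij}(x_i,x_j) = \sum_{x \setminus \{x_i,x_j\}} p(x) \Bigr\}.
```

The local consistency constraints in the LP above give one (loose) outer bound on \mathbb{M}(G); this paper’s contribution is a family of tighter ones.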
I think there was a little theme going on this year relating approximate inference to convex optimization. Besides the two papers above, there were several others in this vein.
3. A. Sanborn, T. Griffiths. Markov Chain Monte Carlo with People. A cute idea: construct an experimental set-up in which people act as the accept/reject module in a Metropolis-Hastings framework, so that we can probe the prior distributions encoded in people’s brains.
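To make the idea concrete, here is a toy Python sketch of the setup as I understand it, with a simulated “subject” standing in for a real person; the function names and the Barker-style choice rule are my own illustration, not code from the paper.

```python
import random
import math

def subject_chooses(a, b, p):
    """Stand-in for a human subject: picks between two stimuli with
    probability proportional to their (unknown) subjective density p.
    This matches the Barker acceptance rule, so the chain's stationary
    distribution is p itself."""
    pa, pb = p(a), p(b)
    return a if random.random() < pa / (pa + pb) else b

def mcmc_with_people(p, propose, x0, n_steps):
    """Metropolis-style chain where the accept/reject step is replaced
    by the subject's two-alternative forced choice."""
    x, samples = x0, []
    for _ in range(n_steps):
        x_new = propose(x)                 # experimenter proposes a stimulus
        x = subject_chooses(x, x_new, p)   # subject picks current vs proposed
        samples.append(x)
    return samples

# Toy run: the "subjective" density is a Gaussian centered at 2.0.
target = lambda x: math.exp(-0.5 * ((x - 2.0) / 0.7) ** 2)
proposal = lambda x: x + random.gauss(0.0, 0.5)  # symmetric random walk
samples = mcmc_with_people(target, proposal, x0=0.0, n_steps=20000)
burned = samples[5000:]
print(sum(burned) / len(burned))  # should be near 2.0
```

The punchline, as I understand it, is that if people’s two-alternative choices follow a Luce-style choice rule, the chain’s stationary distribution is exactly their subjective distribution.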
4. E. Sudderth, M. Wainwright, A. Willsky. Loop Series and Bethe Variational Bounds in Attractive Graphical Models. Another surprising result: in attractive models, if loopy belief propagation converges, the Bethe approximation to the log partition function is actually a LOWER bound on the true log partition function.
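In symbols (my paraphrase): at any fixed point \mu^\star of loopy BP on an attractive pairwise model,

```latex
\log Z \;\ge\; \log Z_{\mathrm{Bethe}} \;=\; -F_{\mathrm{Bethe}}(\mu^\star),
```

which is notable because in general the Bethe free energy gives neither an upper nor a lower bound.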
5. M. Welling, I. Porteous, E. Bart. Infinite State Bayes-Nets for Structured Domains. An interesting idea: construct Bayesian networks with an infinite number of states, using a fairly involved set-up based on hierarchical Dirichlet processes. I am not sure if the software is out yet, but I think such general frameworks for nonparametric models are quite useful for people who want to use these models but don’t want to spend too much time coding up the sometimes involved MCMC samplers.
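To give a flavour of where the “infinite number of states” comes from, here is a toy stick-breaking sketch of a plain Dirichlet process in Python. This is just the standard building block underlying HDPs, not the paper’s actual construction, and all the names are mine.

```python
import random

def stick_breaking(alpha, eps=1e-6):
    """Sample mixture weights from a Dirichlet process by stick breaking:
    beta_k ~ Beta(1, alpha), pi_k = beta_k * prod_{l<k} (1 - beta_l).
    The support is infinite in principle; we stop once the leftover
    stick is negligible."""
    weights, remaining = [], 1.0
    while remaining > eps:
        beta = random.betavariate(1.0, alpha)
        weights.append(remaining * beta)
        remaining *= 1.0 - beta
    return weights

pi = stick_breaking(alpha=2.0)
print(len(pi), sum(pi))  # unbounded number of states, weights sum to ~1
```

An HDP ties several such DPs together through a shared base measure, which is what lets the same states be reused across the network.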
I also liked Luis von Ahn’s invited talk on Human Computation. It’s always good to see that machine learning still has quite a ways to go 🙂
ps: apologies, I stopped maintaining my own blog and ended up losing the domain name. So I’m guest posting here instead.