While everyone is silently working on ICML submissions, I found this discussion about a fast physics simulator chip interesting from a learning viewpoint. Learning often amounts to predicting the outcome of physical processes, and access to a fast simulator for those processes could be quite helpful. Bayesian learning in particular may benefit directly, while many other algorithms (like support vector machines) might have their speed greatly increased.
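To make the Bayesian angle concrete, here is a toy sketch of simulator-based inference (rejection-style approximate Bayesian computation) on a made-up one-parameter process. Everything here is hypothetical illustration, not from the discussion above; the point is just that each posterior draw costs one simulator call, so simulator speed directly bounds inference speed.

```python
import random

random.seed(2)

# Hypothetical "physical simulator": a noisy process whose mean is the
# unknown parameter theta. In ABC, this is the only model access we need.
def simulate(theta, n=20):
    return [random.gauss(theta, 1.0) for _ in range(n)]

observed = simulate(4.0)                # stand-in for real measurements
obs_mean = sum(observed) / len(observed)

accepted = []
for _ in range(5000):                   # one simulator call per candidate
    theta = random.uniform(0, 10)       # draw from a flat prior
    sim = simulate(theta)
    sim_mean = sum(sim) / len(sim)
    if abs(sim_mean - obs_mean) < 0.2:  # keep params whose simulations match
        accepted.append(theta)

# The accepted parameters approximate the posterior; a faster simulator
# means more candidate draws per second and a sharper approximation.
posterior_mean = sum(accepted) / len(accepted)
```

Here a several-orders-of-magnitude speedup in `simulate` would translate almost directly into a several-orders-of-magnitude increase in posterior samples per second.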
The biggest drawback is that writing software for these odd architectures is always difficult and time consuming, but a several-orders-of-magnitude speedup might make that worthwhile.
Talking of trying to predict physical processes, at times I get really curious
to know more about weather prediction models. The problem seems hard for
several reasons. It's a spatial prediction problem – you have to predict the weather over a region, a country, or the whole world. It's an online prediction problem – non-stochastic, but definitely not adversarial. In fact,
weather changes slowly most of the time. If anyone can point to a resource on weather prediction and the role played by learning, that would be useful. I am aware that some of the earlier work on statistical calibration for online problems was motivated by the weather example.
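The "non-stochastic but not adversarial" setting above can be sketched with a standard online-learning tool: exponentially weighted averaging over simple forecasters. The data below is synthetic (a slowly drifting "temperature" series, my own stand-in for real weather), and the two forecasters are deliberately crude; the point is that the online learner quickly concentrates on whichever forecaster suits a slowly changing sequence.

```python
import math
import random

random.seed(0)

# Synthetic slowly drifting "temperature" series (hypothetical data):
# small Gaussian steps, so the sequence changes slowly, like weather.
T = 200
temps, x = [], 15.0
for t in range(T):
    x += random.gauss(0, 0.5)
    temps.append(x)

# Two simple forecasters ("experts"):
def persistence(history):
    return history[-1]                  # tomorrow looks like today

def running_mean(history):
    return sum(history) / len(history)  # tomorrow looks like the long-run average

experts = [persistence, running_mean]
weights = [1.0, 1.0]
eta = 0.1                               # learning rate for the weight updates

total_loss = 0.0
for t in range(1, T):
    history = temps[:t]
    preds = [f(history) for f in experts]
    combined = sum(w * p for w, p in zip(weights, preds)) / sum(weights)
    total_loss += (combined - temps[t]) ** 2
    # Exponential-weights update: shrink weights by squared error,
    # so poorly performing forecasters fade out of the combination.
    for i, p in enumerate(preds):
        weights[i] *= math.exp(-eta * (p - temps[t]) ** 2)

avg_loss = total_loss / (T - 1)
```

Because the series drifts, the persistence forecaster dominates and the combined predictor's average squared loss stays close to the per-step drift variance; no adversarial-case machinery is needed when the sequence changes slowly.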
I have wondered the same thing.
Physical theories are great, but their ability to predict sometimes relies on an impractically large set of measurements. When only a small fraction of the measurements are available, it’s easy to imagine that a learning-based approach (perhaps guided by the physical theory) would be effective.
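One minimal version of "learning guided by the physical theory" is to keep the theory's prediction and learn only a correction to it from the few measurements available. The setup below is entirely hypothetical (a made-up linear "physical model" with a systematic offset it misses), just to show the shape of the idea.

```python
import random

random.seed(1)

# Hypothetical crude "physical theory": predicts a quantity from one observable.
def physical_model(x):
    return 2.0 * x

# Reality (unknown to us): the theory plus a systematic bias plus noise.
def true_process(x):
    return 2.0 * x + 3.0 + random.gauss(0, 0.1)

# Only a small number of measurements are available.
xs = [random.uniform(0, 10) for _ in range(8)]
ys = [true_process(x) for x in xs]

# Learn a constant correction from the residuals of the theory:
# a one-parameter learner that leans on the physics for everything else.
residuals = [y - physical_model(x) for x, y in zip(xs, ys)]
correction = sum(residuals) / len(residuals)

def hybrid_predict(x):
    return physical_model(x) + correction
```

With only eight measurements, fitting the full input-output map would be hopeless, but estimating a single residual parameter on top of the theory is easy; that asymmetry is the appeal of theory-guided learning.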