Machine Learning Reductions Workshop

Date September 11 & 12, 2003
Location TTI-Chicago
University of Chicago Campus
Chicago, Illinois
Topic Reductions are constructs that allow us to transfer theory, algorithms, and expertise from one machine learning problem to another. From a theoretical point of view, reductions are particularly interesting because it appears that they often end up working well in practice. (And Risi notes: from a practical point of view, reductions are particularly interesting because it appears that they often end up working well in theory.)

Examples of practical reductions include:

  1. Boosting (strong classification with weak classification).
  2. Error Correcting Output Codes (multiclass classification with binary classification; see the sketch after this list).
  3. Various reductions of reinforcement learning to regression, classification, etc.
  4. Canonical reductions of classification to density estimation from statistics.
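To make the reduction idea concrete, here is a minimal sketch in Python (numpy only) of the simplest output-code reduction, one-vs-all, which turns one multiclass problem into several binary problems. The ThresholdLearner below is a toy binary learner invented purely for this illustration; any off-the-shelf binary classifier could take its place.

    import numpy as np

    class ThresholdLearner:
        """Hypothetical toy binary learner: scores by distance to class means."""
        def fit(self, X, y):
            self.mu1 = X[y == 1].mean(axis=0)  # mean of positive examples
            self.mu0 = X[y == 0].mean(axis=0)  # mean of negative examples
            return self
        def decision(self, X):
            # Higher score = more confident the binary label is 1.
            return (np.linalg.norm(X - self.mu0, axis=1)
                    - np.linalg.norm(X - self.mu1, axis=1))

    def one_vs_all_fit(X, y, n_classes, Learner=ThresholdLearner):
        # The reduction: train one binary problem per class, "is the label k or not?"
        return [Learner().fit(X, (y == k).astype(int)) for k in range(n_classes)]

    def one_vs_all_predict(learners, X):
        # Decode: predict the class whose binary learner is most confident.
        scores = np.stack([h.decision(X) for h in learners], axis=1)
        return scores.argmax(axis=1)

    # Usage: three Gaussian blobs labeled 0..2.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(loc=c, size=(50, 2)) for c in (0, 3, 6)])
    y = np.repeat(np.arange(3), 50)
    learners = one_vs_all_fit(X, y, 3)
    print((one_vs_all_predict(learners, X) == y).mean())  # training accuracy

One-vs-all is the trivial special case of an Error Correcting Output Code (an identity code matrix); richer codes train binary problems on other partitions of the classes and decode by nearest codeword.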
We are, of course, interested in reductions in a wider sense than just these examples. Obvious questions to address are:
  1. What empirical evidence do we have for and against reductions?
  2. How can we differentiate between "good" reductions and "bad" reductions?
  3. Can we understand the difficulty of learning problems by their placement in a reduction hierarchy?
  4. Can we map out the reductions that are available?
  5. What learning problems are unrelated by reduction?
  6. What reductions are not possible?
A focused workshop involving those who have worked on reductions might be very fruitful in addressing these questions and laying the groundwork for a coherent direction of research.
Funding TTI-Chicago has funds to cover costs of up to $800 each for up to 20 people. The 20 slots will be allocated on a first-come, first-served basis.
Accommodations David McAllester has reserved 10 rooms at the Quadrangle Club for September 10, 11, and 12.
Plan A schedule.