Atomic Learning Workshop

Date March 24 & 25, 2006 (changed!)
Location TTI-Chicago, 1427 E. 60th Street, Chicago, Illinois
University of Chicago Campus
local map
general maps
Topic Real-world applications of machine learning often require learning predictors with complex (i.e., nonlinear) dependencies on input features. A number of architectures have been proposed for learning these, including (but certainly not limited to):
  1. Decision Trees
  2. Convolutional Neural Networks
  3. Deep belief networks
  4. Several learning algorithms applied in computational linguistics.
A trait of all of these algorithms is the repeated use of basic predictive atoms (decision tree :: decision stump, convolutional neural network :: neuron, deep belief network :: logistic regression). Basic questions about this approach include:
  1. What predictive atoms are viable? Is there a notion of 'universally expressive' atoms?
  2. What are the principles for composition of atoms?
  3. What are the principles for tractable learning over atomic architectures? Can global optimization techniques (such as for SVM-like optimizations) scale?
  4. For what problems is this approach essential? useful?
  5. What common properties do such architectures have? Can we design new systems on a theoretical basis and expect them to work?
additional discussion
Funding TTI-Chicago has funds to cover transport and lodging costs for many (at least 10) participants.
Accommodations The Quadrangle Club and International House both provide local rooms. TTI-Chicago will reserve rooms to help with local accommodations.
Plan Send me a title and abstract if you are interested in talking. The rough plan is to have relatively long talks with lots of discussion over the two days of the workshop. The 'heavy discussion' format is essential because the workshop is drawing together people who have worked on very different mechanisms for accomplishing the same thing.