Machine Learning (Theory)

1/26/2016

New York Machine Learning Deadlines

There are a number of Machine Learning related paper deadlines that may be of interest.

January 29 (abstract) for the March 4 New York ML Symposium. Register early because NYAS can only fit 300.
January 27 (abstract) / February 2 (paper) for July 9-15 IJCAI. The biggest AI conference.
February 5 (paper) for June 19-24 ICML. Nina and Kilian have 850 well-vetted reviewers. Marek and Peder have increased space to allow 3K people.
February 12 (paper) for June 23-26 COLT. Vitaly and Sasha are program chairs.
February 12 (proposal) for June 23-24 ICML workshops. Fei and Ruslan are the workshop chairs. I really like workshops.
February 19 (proposal) for June 19 ICML tutorials. Bernhard and Alina have invited a few tutorials already but are saving space for good proposals as well.
March 1 (paper) for June 25-29 UAI. Jersey City isn’t quite New York, but it’s close enough :-)
May ~2 for June 23-24 ICML workshops. Varies with the workshop.

11/29/2015

CNTK and Vowpal Wabbit tutorials at NIPS

Both CNTK and Vowpal Wabbit have pirate tutorials at NIPS. The CNTK tutorial is 1 hour during the lunch break of the Optimization workshop while the VW tutorial is 1 hour during the lunch break of the Extreme Multiclass workshop. Consider dropping by either if interested.

CNTK is a deep learning system started by the speech people who started the deep learning craze; it has grown into a more general, platform-independent deep learning system. It has various useful features, the most interesting of which is perhaps efficient scalable training. Using GPUs with allreduce and one-bit SGD, it achieves both high efficiency and scalability over many more GPUs than could ever fit into a single machine. This capability is unique among open deep learning codebases, so everything else looks nerfed in comparison. CNTK was released in April, so this is the first chance for many people to learn about it. See here for more details.
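
To make the one-bit SGD idea concrete, here is a minimal numpy sketch of my own (a toy, not CNTK's implementation): each worker quantizes its gradient to one sign bit per element plus a single shared scale, carries the quantization error forward into the next minibatch, and the quantized gradients are averaged the way an allreduce would average them.

```python
import numpy as np

def one_bit_quantize(grad, residual):
    # Error feedback: fold in what the previous quantization step lost.
    g = grad + residual
    scale = np.mean(np.abs(g))             # one shared scale, so only sign bits need communicating
    q = np.where(g >= 0.0, scale, -scale)
    return q, g - q                         # new residual = quantization error

# Toy "allreduce" over 4 simulated workers on an 8-dimensional model.
rng = np.random.default_rng(0)
dim, workers, lr = 8, 4, 0.1
w = np.zeros(dim)
residuals = [np.zeros(dim) for _ in range(workers)]
for step in range(100):
    local_grads = [w + rng.normal(scale=0.5, size=dim) for _ in range(workers)]  # stand-in local gradients
    coded = []
    for k in range(workers):
        q, residuals[k] = one_bit_quantize(local_grads[k], residuals[k])
        coded.append(q)
    w -= lr * np.mean(coded, axis=0)        # averaged update, as an allreduce would produce
```

The error feedback is what keeps the aggressive quantization from hurting convergence: whatever the sign bit throws away this minibatch gets added back into the next one.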

The Vowpal Wabbit tutorial just focuses on what is new this year.

  1. The learning to search framework has greatly matured and is now easily used to solve ad hoc joint (structured) prediction problems. The ICML tutorial covers algorithms/analysis, so this is about using the system.
  2. VW has also become the modeling element of a larger system (called the decision service) which gathers data and uses it for Contextual Bandit learning. This is now generally usable, and is the first general-purpose system of this sort; see the sketch after this list.
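
As a rough sketch of the core idea (my own illustration, not the decision service API): the system randomizes at the point of decision, logs the probability of the action it actually took, and inverse propensity scoring then lets any other policy be evaluated or trained from that log.

```python
import random

def choose(context, actions, policy, epsilon=0.1):
    # Epsilon-greedy exploration: randomize at the point of decision and
    # record the probability of the action actually taken.
    greedy = policy(context)
    action = random.choice(actions) if random.random() < epsilon else greedy
    prob = epsilon / len(actions) + (1.0 - epsilon) * (action == greedy)
    return action, prob

def ips_value(logged, target_policy):
    # Inverse-propensity-scored estimate of a new policy's average reward,
    # computed from (context, action, prob, reward) tuples gathered by choose().
    total = 0.0
    for context, action, prob, reward in logged:
        if target_policy(context) == action:
            total += reward / prob
    return total / len(logged)
```

Because every action has a known nonzero logged probability, the estimate is unbiased for any target policy, which is what makes offline evaluation and learning from the gathered data possible.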

10/30/2015

ICML 2016 in NYC and KDD Cup 2016

ICML 2016 is in New York City. I expect it to be the largest ICML by far given the destination: New York is perhaps the easiest place in the world to reach, and it has the largest machine learning meetup anywhere in the world.

I am the general chair this year, which is light in work but heavy in responsibilities. Some things I worry about:

  1. How many people will actually come? Numbers are difficult to guess with the field growing and the conference changing locations. I believe we need capacity for at least 3000 people based on everything I know.
  2. New York is expensive. What can be done about it? One thought is that we should actively set up a roommate-finding system so hotel costs can be shared. Up to 3 people can share a room at the conference hotel (yes, each with their own bed), which makes the price much more reasonable. I’m also hoping donations will substantially defray the cost. If others have creative ideas, I’m definitely interested.

Markus Weimer also points out the 2016 KDD Cup, which has a submission deadline of December 6. KDD Cup datasets have become a common reference for many machine learning papers, so this is a good way to get your problem solved well by many people.

12/6/2014

Vowpal Wabbit 7.8 at NIPS

I just created Vowpal Wabbit 7.8, and we are planning to have an increasingly less heretical followup tutorial during the non-“ski break” at the NIPS Optimization workshop. Please join us if interested.

I always feel like things are going slowly, but over the last year there have been many changes overall. Notes for 7.7 are here. Since then, there are several areas of improvement as well as generalized bug fixes and refactoring.

  1. Learning to Search: Hal completely rewrote the learning to search system, enough that the numbers here are looking obsolete. Kai-Wei has also created several advanced applications for entity-relation and dependency parsing which are promising.
  2. Languages: Hal also created a good Python library, which includes callbacks for learning to search. You can now develop advanced structured prediction solutions in a nice language. Jonathan Morra also contributed an initial Java interface.
  3. Exploration: The contextual bandit subsystem now allows evaluation of an arbitrary policy, and an exploration library has been factored out as an independent component (principally by Luong with help from Sid and Sarah). This is critical for real applications because randomization must happen at the point of decision.
  4. Reductions: The learning reductions subsystem has continued to mature, although the perfectionist in me is still dissatisfied. It’s now pretty easy to program new reductions, and the efficiency of these reductions has generally improved. Several new ones are cooking (see the sketch after this list).
  5. Online Learning: Alekh added an online SVM implementation based on LaSVM. This is known to parallelize well via the para-active approach.
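
To illustrate what a learning reduction looks like, here is a toy one-against-all sketch of my own, which reduces K-class classification to K binary problems. It only shows the shape of the idea; VW's reduction stack is written in C++ and composes reductions over a shared base learner rather than hard-coding one.

```python
import numpy as np

class OneAgainstAll:
    """Toy multiclass-to-binary reduction: one linear binary scorer per class."""

    def __init__(self, num_classes, dim, lr=0.1):
        self.w = np.zeros((num_classes, dim))   # one weight vector per induced binary problem
        self.lr = lr

    def predict(self, x):
        # Predict the class whose binary scorer fires strongest.
        return int(np.argmax(self.w @ x))

    def learn(self, x, label):
        # Each multiclass example induces one binary example per class.
        scores = self.w @ x
        for c in range(len(self.w)):
            y = 1.0 if c == label else -1.0
            if y * scores[c] < 1.0:              # hinge-loss style update on the binary problem
                self.w[c] += self.lr * y * x
```

The payoff of the reduction view is that the binary learner underneath can be swapped out or improved without touching the multiclass logic on top.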

This project has grown quite a bit: about 30 different people have contributed to VW since the last release, and there is now a VW meetup (December 8th!) in the Bay Area that I wish I could attend.

7/26/2014

Open Machine Learning Workshop, August 22

On August 22, we are planning to have an Open Machine Learning Workshop at MSR, New York City, taking advantage of CJ Lin and others being in town for KDD.

If you are interested, please email msrnycrsvp at microsoft.com and say “I want to come” so we can get a count of attendees for refreshments.

Added: Videos are now online.
