My greatest concern with the many machine learning conferences in New York this year is the relatively high cost they imply, particularly for hotel rooms in Manhattan. Keeping the conference affordable for graduate students seems critical to what ICML is really about.
The price becomes much more reasonable if you can find roommates to share the cost. For example, the conference hotel offers rooms with 3 beds.
This still leaves a coordination problem: how do you find plausible roommates? If only there were a website where the participants in a conference could look for roommates. Oh wait, there is. Conferenceshare.co is something new which might measurably address the cost problem. Obviously, you’ll want to consider roommate possibilities carefully, but now at least there is a place to meet.
Note that the early registration deadline for ICML is May 7th.
I’m doing a Quora Session today that may be of interest. I’m impressed with both the quality and quantity of questions.
Registration is here. I would recommend registering early because there is a difficult-to-estimate(*) chance that you will not be able to register later.
The program is shaping up and should be of interest. The 9 Tutorials(**), 4 Invited Speakers, and 23 Workshops are all chosen, with paper decisions due out in a couple weeks.
Registration is Early (by May 7) or Full (after May 7), with Regular and Student rates. These rates are as aggressively low as the local chairs and I can sleep with at night. The prices are higher than I’d like (New York is expensive), but a bit lower than last year’s, particularly for students(***).
(*) Relevant facts:
- ICML 2016: submissions up 30% to 1300.
- NIPS 2015 in Montreal: 3900 registrations (way up from last year).
- NIPS 2016 is in Barcelona.
- ICML 2015 in Lille: 1670 registrations.
- KDD 2014 in NYC: registration closed at 3000, one week before the conference.
I tried to figure out how to set up a prediction market to estimate what will happen this year, but didn’t find an easy enough way to do that.
(**) I kind of wish we could make up the titles. How about: “Go is Too Easy” and “My Neural Network is Deeper than Yours”?
(***) Sponsors are very generous and are mostly giving to defray student costs. Approximately every dollar of the difference between the Regular and Student registration rates is due to company donations. Also note that some scholarship opportunities for students to defray costs will be announced soon.
There are a number of machine learning related paper deadlines that may be of interest.
- January 29 (abstract) for March 4: the New York ML Symposium. Register early because NYAS can only fit 300.
- January 27 (abstract) / February 2 (paper) for July 9-15: IJCAI, the biggest AI conference.
- February 5 (paper) for June 19-24: ICML. Nina and Kilian have 850 well-vetted reviewers. Marek and Peder have increased space to allow 3K people.
- February 12 (paper) for June 23-26: COLT. Vitaly and Sasha are program chairs.
- February 12 (proposal) for June 23-24: ICML workshops. Fei and Ruslan are the workshop chairs. I really like workshops.
- February 19 (proposal) for June 19: ICML tutorials. Bernhard and Alina have invited a few tutorials already but are saving space for good proposals as well.
- March 1 (paper) for June 25-29: UAI. Jersey City isn’t quite New York, but it’s close enough.
- May ~2 for June 23-24: ICML workshop papers; deadlines vary with the workshop.
Both CNTK and Vowpal Wabbit have pirate tutorials at NIPS. The CNTK tutorial is 1 hour during the lunch break of the Optimization workshop while the VW tutorial is 1 hour during the lunch break of the Extreme Multiclass workshop. Consider dropping by either if interested.
CNTK is a deep learning system started by the speech people who started the deep learning craze, and it has grown into a more general, platform-independent deep learning system. It has various useful features, the most interesting of which is perhaps efficient scalable training. Using GPUs with allreduce and one-bit SGD, it achieves both high efficiency and scalability over many more GPUs than could ever fit into a single machine. This capability is unique amongst open deep learning codebases, so everything else looks nerfed in comparison. CNTK was released in April, so this is the first chance many people have had to learn about it. See here for more details.
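The scalability claim rests on the one-bit SGD trick: each worker transmits only the sign of each gradient coordinate plus a single scale factor, and the quantization error is added back into the next step's gradient. Here is a minimal NumPy sketch of that error-feedback quantization; it illustrates the published idea, not CNTK's actual implementation, and the mean-magnitude scale factor is a simplifying assumption.

```python
import numpy as np

def one_bit_quantize(grad, residual):
    """Quantize a gradient to one bit per coordinate with error feedback.

    A worker would transmit only the signs plus a single scale factor;
    the quantization error is folded into the next step's gradient
    (error feedback), which is what keeps aggressive quantization from
    wrecking convergence.
    """
    adjusted = grad + residual           # fold in the previous step's error
    scale = np.abs(adjusted).mean()      # one float for the whole vector
    quantized = scale * np.sign(adjusted)
    new_residual = adjusted - quantized  # carried over to the next step
    return quantized, new_residual

# toy usage: quantize one gradient, then verify nothing is lost overall
grad = np.array([0.5, -1.0, 0.25])
q, resid = one_bit_quantize(grad, np.zeros_like(grad))
```

Note that `q + resid` exactly recovers the adjusted gradient, so nothing is permanently discarded; the error simply arrives a step late.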
The Vowpal Wabbit tutorial just focuses on what is new this year.
- The learning to search framework has greatly matured and is now easily used to solve ad hoc joint (structured) prediction problems. The ICML tutorial covered the algorithms and analysis, so this one is about using the system.
- VW has also become the modeling element of a larger system (called the decision service) which gathers data and uses it for contextual bandit learning. This is now generally usable, and it is the first general-purpose system of this sort.
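To make "joint prediction via learning to search" concrete: the core loop reduces sequence prediction to cost-sensitive classification by rolling in with the current policy and costing every one-step deviation at each position. Below is a minimal sketch of that reduction, under the simplifying assumption that an oracle completes the sequence after each deviation; it illustrates the idea and is not the VW API (all names here are hypothetical).

```python
def collect_search_examples(sequence, gold, policy, actions, featurize):
    """One pass of learning to search (SEARN/DAgger flavor).

    Roll in with the current policy; at each position, cost every
    candidate action by the loss of a one-step deviation, yielding one
    cost-sensitive classification example per position.
    """
    examples, prefix = [], []
    for t in range(len(sequence)):
        costs = {}
        for a in actions:
            deviated = prefix + [a]
            # Hamming loss of the deviated prefix vs. the gold labels,
            # assuming an oracle completes the rest perfectly.
            costs[a] = sum(p != g for p, g in zip(deviated, gold))
        examples.append((featurize(sequence, prefix, t), costs))
        prefix.append(policy(sequence, prefix, t))  # roll-in step
    return examples

# toy usage: part-of-speech tagging with an oracle roll-in policy
actions = ["NOUN", "VERB"]
sentence = ["dogs", "chase", "cats"]
gold = ["NOUN", "VERB", "NOUN"]
oracle = lambda seq, prefix, t: gold[t]
feats = lambda seq, prefix, t: (seq[t], prefix[-1] if prefix else "^")
examples = collect_search_examples(sentence, gold, oracle, actions, feats)
```

The resulting examples feed any cost-sensitive classifier, which is exactly the reduction that lets a system like VW handle joint prediction with ordinary learning machinery.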
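The data-gathering pattern behind such a system: explore a little, log the probability with which each action was taken, and use inverse propensity scoring to evaluate or train policies offline. A minimal sketch of that loop follows; this is not the decision service's actual API, and the function names are hypothetical.

```python
import random

def choose(context, score, n_actions, epsilon=0.1):
    """Epsilon-greedy exploration; returns (action, probability).

    Logging the probability each action was chosen with is the key
    step: it makes the data reusable later for off-policy evaluation
    and contextual bandit learning.
    """
    best = max(range(n_actions), key=lambda a: score(context, a))
    explore = random.random() < epsilon
    action = random.randrange(n_actions) if explore else best
    if action == best:
        prob = (1 - epsilon) + epsilon / n_actions
    else:
        prob = epsilon / n_actions
    return action, prob

def ips_value(logged, target_policy):
    """Inverse propensity score estimate of a target policy's value
    from logged (context, action, reward, prob) tuples."""
    return sum(r / p for c, a, r, p in logged
               if target_policy(c) == a) / len(logged)

# toy usage: evaluate "always pick action 1" from two logged events;
# only the first event agrees with the target policy
logged = [(0, 1, 1.0, 0.5), (1, 0, 1.0, 0.5)]
estimate = ips_value(logged, lambda c: 1)
```

Dividing by the logged probability reweights the events where the target policy agrees with the logged action, giving an unbiased estimate of the target policy's value without ever deploying it.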