{"id":206,"date":"2006-07-05T07:02:08","date_gmt":"2006-07-05T13:02:08","guid":{"rendered":"http:\/\/hunch.net\/?p=206"},"modified":"2006-07-05T07:02:45","modified_gmt":"2006-07-05T13:02:45","slug":"more-icml-papers","status":"publish","type":"post","link":"https:\/\/hunch.net\/?p=206","title":{"rendered":"more icml papers"},"content":{"rendered":"<p>Here are a few other papers I enjoyed from ICML06.<\/p>\n<p>Topic Models:<\/p>\n<ul>\n<li><a href=\"http:\/\/www.icml2006.org\/icml_documents\/camera-ready\/015_Dynamic_Topic_Models.pdf\"><br \/>\nDynamic Topic Models<\/a><br \/>\nDavid Blei, John Lafferty<br \/>\nA nice model for how topics in LDA type models can evolve over time,<br \/>\nusing a linear dynamical system on the natural parameters and a very<br \/>\nclever structured variational approximation (in which the mean field<br \/>\nparameters are pseudo-observations of a virtual LDS). Like all Blei<br \/>\npapers, he makes it look easy, but it is extremely impressive.<\/li>\n<li><a href=\"http:\/\/www.icml2006.org\/icml_documents\/camera-ready\/073_Pachinko_Allocation.pdf\"><br \/>\nPachinko Allocation<\/a><br \/>\nWei Li, Andrew McCallum<br \/>\nA very elegant (but computationally challenging) model which induces<br \/>\ncorrelation amongst topics using a multi-level DAG whose interior nodes<br \/>\nare &#8220;super-topics&#8221; and &#8220;sub-topics&#8221; and whose leaves are the<br \/>\nvocabulary words. 
Makes the slumbering monster of structure learning stir.<\/li>\n<\/ul>\n<p>Sequence Analysis (I missed these talks since I was chairing another session)<\/p>\n<ul>\n<li><a href=\"http:\/\/www.icml2006.org\/icml_documents\/camera-ready\/083_Online_Decoding_of_M.pdf\"><br \/>\nOnline Decoding of Markov Models with Latency Constraints<\/a><br \/>\nMukund Narasimhan, Paul Viola, Michael Shilman<br \/>\nAn &#8220;ah-ha!&#8221; paper showing how to trade off latency and decoding<br \/>\naccuracy when doing MAP labelling (Viterbi decoding) in sequential<br \/>\nMarkovian models. You&#8217;ll wish you thought of this yourself.<\/li>\n<li><a href=\"http:\/\/www.icml2006.org\/icml_documents\/camera-ready\/100_Efficient_Inference.pdf\"><br \/>\nEfficient inference on sequence segmentation models<\/a><br \/>\nSunita Sarawagi<br \/>\nA smart way to re-represent potentials in segmentation models<br \/>\nto reduce the complexity of inference from cubic in the input sequence<br \/>\nlength to linear. Also check out her NIPS2004 paper with William Cohen<br \/>\non &#8220;segmentation CRFs&#8221;. Moral of the story: segmentation is NOT just<br \/>\nsequence labelling.<\/li>\n<\/ul>\n<p>Optimal Partitionings\/Labellings<\/p>\n<ul>\n<li><a href=\"http:\/\/www.icml2006.org\/icml_documents\/camera-ready\/079_The_Uniqueness_of_a.pdf\"><br \/>\nThe uniqueness of a good optimum for K-means<\/a><br \/>\nMarina Meila<br \/>\nMarina shows a stability result for K-means clustering, namely<br \/>\nthat if you find a &#8220;good&#8221; clustering it is not too &#8220;different&#8221; from the<br \/>\n(unknowable) optimal clustering and that all other good clusterings<br \/>\nare &#8220;near&#8221; it. 
So, don&#8217;t worry about local minima in K-means as long<br \/>\nas you get a low objective value.\n<\/li>\n<li><a href=\"http:\/\/www.icml2006.org\/icml_documents\/camera-ready\/093_Quadratic_Programmin.pdf\"><br \/>\nQuadratic Programming Relaxations for Metric Labeling and Markov Random Field MAP Estimation<\/a><br \/>\nPradeep Ravikumar, John Lafferty<br \/>\nPradeep and John introduce QP relaxations for the problem of finding<br \/>\nthe best joint labelling of a set of points (connected by a weighted<br \/>\ngraph, with a known metric cost between labels; they also extend<br \/>\nthis to the non-metric case). Surprisingly, they show that the QP relaxation<br \/>\nis both computationally more attractive and more accurate than<br \/>\nthe &#8220;natural&#8221; LP relaxation or loopy BP approximations.<\/li>\n<\/ul>\n<p>Distinguished Paper Award Winners<\/p>\n<ul>\n<li><a href=\"http:\/\/www.icml2006.org\/icml_documents\/camera-ready\/095_How_Boosting_the_Mar.pdf\"><br \/>\nHow Boosting the Margin Can Also Boost Classifier Complexity<\/a><br \/>\nLev Reyzin, Robert Schapire<\/li>\n<li><a href=\"http:\/\/www.icml2006.org\/icml_documents\/camera-ready\/026_Trading_Convexity_fo.pdf\"><br \/>\nTrading Convexity for Scalability<\/a><br \/>\nRonan Collobert, Fabian Sinz, Jason Weston, Leon Bottou<\/li>\n<li><a href=\"http:\/\/www.icml2006.org\/icml_documents\/camera-ready\/052_Looping_Suffix_Tree.pdf\"><br \/>\nLooping Suffix Tree-Based Inference of Partially Observable Hidden State<\/a><br \/>\nMichael Holmes, Charles Isbell<\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>Here are a few other papers I enjoyed from ICML06. 
Topic Models: Dynamic Topic Models David Blei, John Lafferty A nice model for how topics in LDA type models can evolve over time, using a linear dynamical system on the natural parameters and a very clever structured variational approximation (in which the mean field parameters &hellip; <\/p>\n<p class=\"link-more\"><a href=\"https:\/\/hunch.net\/?p=206\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;more icml papers&#8221;<\/span><\/a><\/p>\n","protected":false},"author":17,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[29,18],"tags":[],"class_list":["post-206","post","type-post","status-publish","format-standard","hentry","category-machine-learning","category-papers"],"_links":{"self":[{"href":"https:\/\/hunch.net\/index.php?rest_route=\/wp\/v2\/posts\/206","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/hunch.net\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/hunch.net\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/hunch.net\/index.php?rest_route=\/wp\/v2\/users\/17"}],"replies":[{"embeddable":true,"href":"https:\/\/hunch.net\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=206"}],"version-history":[{"count":0,"href":"https:\/\/hunch.net\/index.php?rest_route=\/wp\/v2\/posts\/206\/revisions"}],"wp:attachment":[{"href":"https:\/\/hunch.net\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=206"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/hunch.net\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=206"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/hunch.net\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=206"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}