Luke Zettlemoyer (University of Washington) “End-to-End Deep Learning for Broad Coverage Semantics: SRL, Coreference and Beyond”
3400 N Charles St, Baltimore, MD 21218
Deep learning with large supervised training sets has had significant impact on many research challenges, from speech recognition to machine translation. However, applying these ideas to problems in computational semantics has been difficult, at least in part due to modest dataset sizes and relatively complex structured prediction tasks.
In this talk, I will present two recent results on end-to-end deep learning for classic challenge problems in computational semantics: semantic role labeling and coreference resolution. In both cases, we introduce relatively simple deep neural network approaches that use no preprocessing (e.g. no POS tagger or syntactic parser) and achieve significant performance gains, including over 20% relative error reductions when compared to non-neural methods. I will also briefly discuss our efforts to crowdsource large new datasets that should, in the very near future, provide orders of magnitude more data for training such models. Our hope is that these two advances, when combined, will enable very high quality semantic analysis in any domain from easily gathered supervision.
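For readers unfamiliar with the metric, a brief sketch of how a "relative error reduction" is computed (the numbers below are hypothetical illustrations, not results reported in the talk):

```python
def relative_error_reduction(baseline_f1: float, new_f1: float) -> float:
    """Fraction of the baseline system's error eliminated by the new system.

    A 20% relative error reduction means the new system makes 20% fewer
    errors than the baseline, measured against the baseline's error rate.
    """
    baseline_error = 1.0 - baseline_f1
    new_error = 1.0 - new_f1
    return (baseline_error - new_error) / baseline_error

# Hypothetical example: raising F1 from 0.80 to 0.85 removes
# a quarter of the baseline's remaining error.
print(round(relative_error_reduction(0.80, 0.85), 2))  # → 0.25
```

Note that even a modest-looking absolute gain (here, 5 F1 points) can correspond to a substantial relative error reduction when the baseline is already strong.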
This is joint work with Luheng He, Kenton Lee, and Mike Lewis.
Luke Zettlemoyer is an Associate Professor in the Paul G. Allen School of Computer Science & Engineering at the University of Washington, and also leads the AllenNLP project at the Allen Institute for Artificial Intelligence. His research focuses on empirical computational semantics, and involves designing machine learning algorithms and building large datasets. Honors include multiple paper awards, a PECASE award, and an Allen Distinguished Investigator Award. Luke received his PhD from MIT and was a postdoc at the University of Edinburgh.