The field of NLP is in the midst of a disruptive shift, fueled most recently by the advent of large language models (LLMs), with impacts on our methodologies, funding, and public perception. While the core technologies and scope of real-world impact of our field may be changing (everything is different!), many of the key challenges our field has faced since its inception remain (nothing has changed!). In this talk I’ll describe recent work characterizing and tackling some of these challenges, notably data-efficient domain adaptation and lifelong learning. I will also anchor the discussion of cycles and shifts in the field by describing findings from a qualitative study of factors shaping the community over time, including culture, incentives, and infrastructure. Through these complementary lenses into the past, present, and future, I aim to inspire shared hope, excitement, and discussion.
Emma Strubell is the Raj Reddy Assistant Professor in the Language Technologies Institute in the School of Computer Science at Carnegie Mellon University, and a Visiting Scientist at the Allen Institute for Artificial Intelligence. Previously she held research scientist roles at Google and FAIR after earning her doctoral degree in 2019 from the University of Massachusetts Amherst. Her research lies at the intersection of natural language processing and machine learning, with a focus on providing pragmatic solutions to practitioners who wish to gain insights from natural language text via computation- and data-efficient AI. Her work has been recognized with a Madrona AI Impact Award and best paper awards at ACL and EMNLP, and has been cited in news outlets including The New York Times and The Wall Street Journal.