BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//128.220.36.25//NONSGML kigkonsult.se iCalcreator 2.26.9//
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-FROM-URL:https://www.clsp.jhu.edu
X-WR-TIMEZONE:America/New_York
BEGIN:VTIMEZONE
TZID:America/New_York
X-LIC-LOCATION:America/New_York
BEGIN:STANDARD
DTSTART:20231105T020000
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
RDATE:20241103T020000
TZNAME:EST
END:STANDARD
BEGIN:DAYLIGHT
DTSTART:20240310T020000
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
RDATE:20250309T020000
TZNAME:EDT
END:DAYLIGHT
END:VTIMEZONE
BEGIN:VEVENT
UID:ai1ec-21487@www.clsp.jhu.edu
DTSTAMP:20240328T135647Z
CATEGORIES;LANGUAGE=en-US:Seminars
DESCRIPTION:Abstract
\nEnormous amounts of ever-changing knowledge are available online in
diverse textual styles and formats. Recent advances in deep learning
algorithms and large-scale datasets are spurring progress in many
Natural Language Processing (NLP) tasks\, including question answering.
Nevertheless\, these models cannot scale up when task-annotated training
data are scarce. This talk presents my lab’s work toward building
general-purpose models in NLP and how to systematically evaluate them.
First\, I present a general model for two question answering tasks\, in
English and in multiple languages\, that is robust to small domain
shifts. Then\, I show a meta-training approach that can solve a variety
of NLP tasks using only a few examples\, and introduce a benchmark to
evaluate cross-task generalization. Finally\, I discuss neuro-symbolic
approaches to address more complex tasks by eliciting knowledge from
structured data and language models.
\n\nBiography
\n\nHanna Hajishirzi is an Assistant Professor in the Paul G. Allen
School of Computer Science & Engineering at the University of Washington
and a Senior Research Manager at the Allen Institute for AI. Her
research spans different areas in NLP and AI\, focusing on developing
general-purpose machine learning algorithms that can solve many NLP
tasks. Applications of these algorithms include question answering\,
representation learning\, green AI\, knowledge extraction\, and
conversational dialogue. Her honors include the NSF CAREER Award\, a
Sloan Fellowship\, the Allen Distinguished Investigator Award\, the
Intel Rising Star Award\, best paper and honorable mention awards\, and
several industry research faculty awards. Hanna received her PhD from
the University of Illinois and spent a year as a postdoc at Disney
Research and CMU.
DTSTART;TZID=America/New_York:20220225T120000
DTEND;TZID=America/New_York:20220225T131500
LOCATION:Ames Hall 234 - Presented Virtually Via Zoom
https://wse.zoom.us/j/96735183473
SEQUENCE:0
SUMMARY:Hanna Hajishirzi (University of Washington & Allen Institute
for AI) “Toward Robust\, Knowledge-Rich NLP”
URL:https://www.clsp.jhu.edu/events/hanna-hajishirzi-university-of-washington-allen-institute-for-ai-toward-robust-knowledge-rich-nlp/
X-COST-TYPE:free
X-TAGS;LANGUAGE=en-US:2022\,February\,Hajishirzi
END:VEVENT
BEGIN:VEVENT
UID:ai1ec-22395@www.clsp.jhu.edu
DTSTAMP:20240328T135647Z
CATEGORIES;LANGUAGE=en-US:Seminars
DESCRIPTION:Abstract
\nRecursive calls over recursive data are widely useful for generating
probability distributions\, and probabilistic programming allows
computations over these distributions to be expressed in a modular and
intuitive way. Exact inference is also useful\, but unfortunately\,
existing probabilistic programming languages do not perform exact
inference on recursive calls over recursive data\, forcing programmers
to code many applications manually. We introduce a probabilistic
language in which a wide variety of recursion can be expressed
naturally\, and inference carried out exactly. For instance\,
probabilistic pushdown automata and their generalizations are easy to
express\, and polynomial-time parsing algorithms for them are derived
automatically. We eliminate recursive data types using program
transformations related to defunctionalization and refunctionalization.
These transformations are assured correct by a linear type system\, and
a successful choice of transformations\, if there is one\, is guaranteed
to be found by a greedy algorithm. I will also describe the
implementation of this language in two phases: first\, compilation to a
factor graph grammar\, and second\, computing the sum-product of the
factor graph grammar.
\n
\nBiography
\nDavid Chiang (PhD\, University of Pennsylvania\, 2004) is an associate
professor in the Department of Computer Science and Engineering at the
University of Notre Dame. His research is on computational models for
learning human languages\, particularly how to translate from one
language to another. His work on applying formal grammars and machine
learning to translation has been recognized with two best paper awards
(at ACL 2005 and NAACL HLT 2009). He has received research grants from
DARPA\, NSF\, Google\, and Amazon\, has served on the executive board of
NAACL and the editorial board of Computational Linguistics and JAIR\,
and is currently on the editorial board of Transactions of the ACL.
DTSTART;TZID=America/New_York:20221017T120000
DTEND;TZID=America/New_York:20221017T131500
LOCATION:Hackerman Hall B17 @ 3400 N. Charles Street\, Baltimore\, MD 21218
SEQUENCE:0
SUMMARY:David Chiang (University of Notre Dame) “Exact Recursive
Probabilistic Programming with Colin McDonald\, Darcey Riley\, Kenneth
Sible (Notre Dame) and Chung-chieh Shan (Indiana)”
URL:https://www.clsp.jhu.edu/events/david-chiang-university-of-notre-dame/
X-COST-TYPE:free
X-TAGS;LANGUAGE=en-US:2022\,Chiang\,October
END:VEVENT
END:VCALENDAR