Seminars

Fri, Nov 16
David Harrison (Swarthmore College and National Geographic Society) “Endangered Languages” @ Hackerman Hall B17
Nov 16 @ 12:00 pm – 1:15 pm

Abstract

Half of the world’s languages are endangered and may go extinct in this century. The loss of these languages will have dire consequences not only for their speakers, but also for culture, science, and the environment. Around the world, speakers of endangered languages are mounting strategic efforts to save their languages. This presentation features photos and video clips of speakers of some of the world’s most endangered languages, from Siberia, India, the USA, and other locations, and demonstrates how indigenous speakers and linguists are working to sustain languages through technology and digital activism.

Biography

Anthropologist and linguist David Harrison has been a National Geographic Fellow and co-director of the Society’s Enduring Voices Project, documenting endangered languages and cultures around the world. He has done extensive fieldwork with indigenous communities from Siberia and Mongolia to Peru, India, and Australia. His global research is the subject of the acclaimed documentary film The Linguists, and his work has been featured in numerous publications including The New York Times, USA Today, and Science. David is both a professor of linguistics and associate provost for academic programs at Swarthmore College.


Mon, Nov 19
James Foulds (UMBC) “Differential Fairness for Machine Learning and Artificial Intelligence Systems: Unbiased Decisions with Biased Data” @ Hackerman Hall B17
Nov 19 @ 12:00 pm – 1:15 pm

Abstract

With the rising influence of machine learning algorithms on many important aspects of our daily lives, there are growing concerns that biases inherent in data can lead the behavior of these algorithms to discriminate against certain populations. Biased data can lead data-driven algorithms to produce biased outcomes along lines of gender, race, sexual orientation, and political ties, with important real-world consequences, including decision-making for lending and law enforcement. Thus, there is an urgent need for machine learning algorithms that make unbiased decisions with biased data. We propose a novel framework for measuring and correcting bias in data-driven algorithms, with inspiration from privacy-preserving machine learning and Bayesian probabilistic modeling. A case study on census data demonstrates the utility of our approach.

Biography

Dr. James Foulds is an Assistant Professor in the Department of Information Systems at UMBC. His research interests span both applied and foundational machine learning, focusing on probabilistic latent variable models and the inference algorithms used to learn them from data. His work aims to promote the practice of probabilistic modeling for computational social science and to improve AI’s role in society regarding privacy and fairness. He earned his Ph.D. in computer science at the University of California, Irvine, and was a postdoctoral scholar at the University of California, Santa Cruz, and subsequently at the University of California, San Diego. He earned his master’s and bachelor’s degrees with first-class honours at the University of Waikato, New Zealand, where he also contributed to the Weka data mining system.

Mon, Nov 26
Spence Green (Lilt) @ Hackerman Hall B17
Nov 26 @ 12:00 pm – 1:15 pm

Fri, Nov 30
Yi-Chia Wang (CMU) @ Hackerman Hall B17
Nov 30 @ 12:00 pm – 1:15 pm

Center for Language and Speech Processing