**Abstract**

The use of Bayesian methods in large-scale data settings is attractive because of the rich hierarchical relationships, uncertainty quantification, and prior specification these methods provide. Many standard Bayesian inference algorithms are often computationally expensive, however, so their direct application to large datasets can be difficult or infeasible. Other standard algorithms sacrifice accuracy in the pursuit of scalability. We take a new approach. Namely, we leverage the insight that data often exhibit approximate redundancies to instead obtain a weighted subset of the data (called a “coreset”) that is much smaller than the original dataset. We can then use this small coreset in existing Bayesian inference algorithms without modification. We provide theoretical guarantees on the size and approximation quality of the coreset. In particular, we show that our method provides geometric decay in posterior approximation error as a function of coreset size. We validate on both synthetic and real datasets, demonstrating that our method reduces posterior approximation error by orders of magnitude relative to uniform random subsampling.
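To make the coreset idea concrete, here is a minimal, hypothetical sketch in Python. It uses a toy Gaussian-mean model and a uniformly subsampled, up-weighted subset purely for illustration; the actual method described in the abstract chooses points and weights far more carefully to obtain its geometric error guarantees. All names and the model here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: N observations from a Gaussian with unknown mean theta.
N = 10_000
data = rng.normal(loc=2.0, scale=1.0, size=N)

def log_likelihood(theta, x, weights=None):
    """Weighted Gaussian log-likelihood (up to a constant).

    With weights=None this is the ordinary full-data log-likelihood;
    with a weighted coreset it is a cheap surrogate for it.
    """
    if weights is None:
        weights = np.ones_like(x)
    return float(np.sum(weights * (-0.5 * (x - theta) ** 2)))

# Illustrative "coreset": M points, each up-weighted by N/M so the
# weighted log-likelihood matches the full one in expectation.
# (A real coreset construction would pick points and weights to
# minimize the approximation error, not uniformly at random.)
M = 100
idx = rng.choice(N, size=M, replace=False)
coreset, weights = data[idx], np.full(M, N / M)

full = log_likelihood(1.9, data)           # O(N) per evaluation
approx = log_likelihood(1.9, coreset, weights)  # O(M) per evaluation
```

The key point is that any existing inference algorithm (e.g. MCMC) that only touches the data through the log-likelihood can be run unchanged on the weighted coreset, replacing each O(N) likelihood evaluation with an O(M) one.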

**Biography**

Tamara Broderick is the ITT Career Development Assistant Professor in the Department of Electrical Engineering and Computer Science at MIT. She is a member of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL), the MIT Statistics and Data Science Center, and the Institute for Data, Systems, and Society (IDSS). She completed her Ph.D. in Statistics at the University of California, Berkeley in 2014. Previously, she received an AB in Mathematics from Princeton University (2007), a Master of Advanced Study for completion of Part III of the Mathematical Tripos from the University of Cambridge (2008), an MPhil by research in Physics from the University of Cambridge (2009), and an MS in Computer Science from the University of California, Berkeley (2013). Her recent research has focused on developing and analyzing models for scalable Bayesian machine learning. She has been awarded an NSF CAREER Award (2018), a Sloan Research Fellowship (2018), an Army Research Office Young Investigator Program award (2017), Google Faculty Research Awards, the ISBA Lifetime Members Junior Researcher Award, the Savage Award (for an outstanding doctoral dissertation in Bayesian theory and methods), the Evelyn Fix Memorial Medal and Citation (for the Ph.D. student on the Berkeley campus showing the greatest promise in statistical research), the Berkeley Fellowship, an NSF Graduate Research Fellowship, a Marshall Scholarship, and the Phi Beta Kappa Prize (for the graduating Princeton senior with the highest academic average).