**Abstract**

The use of Bayesian methods in large-scale data settings is attractive because of the rich hierarchical relationships, uncertainty quantification, and prior specification these methods provide. Many standard Bayesian inference algorithms are computationally expensive, however, so their direct application to large datasets can be difficult or infeasible. Other standard algorithms sacrifice accuracy in the pursuit of scalability. We take a new approach. Namely, we leverage the insight that data often exhibit approximate redundancies to instead obtain a weighted subset of the data (called a “coreset”) that is much smaller than the original dataset. We can then use this small coreset in existing Bayesian inference algorithms without modification. We provide theoretical guarantees on the size and approximation quality of the coreset. In particular, we show that our method provides geometric decay in posterior approximation error as a function of coreset size. We validate on both synthetic and real datasets, demonstrating that our method reduces posterior approximation error by orders of magnitude relative to uniform random subsampling.
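To make the weighted-subset idea concrete, here is a minimal sketch (our own illustration, not the coreset construction from the talk): a full-data log-likelihood is replaced by a weighted sum over a small subset. We use the uniform-random-subsampling baseline mentioned in the abstract, where each of the m retained points carries weight N/m so the weighted sum is an unbiased estimate of the full-data value. The Gaussian model and all variable names are assumptions for the example.

```python
import numpy as np

# Full dataset: N = 100,000 draws from a unit-variance Gaussian.
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=100_000)

def log_lik(theta, x, weights=None):
    """Gaussian log-likelihood of mean `theta` (unit variance),
    optionally weighted per data point."""
    ll = -0.5 * (x - theta) ** 2 - 0.5 * np.log(2 * np.pi)
    if weights is None:
        return ll.sum()
    return (weights * ll).sum()

# Uniform random subsample of size m; weight N/m per point makes the
# weighted log-likelihood an unbiased estimate of the full-data one.
m = 1_000
idx = rng.choice(data.size, size=m, replace=False)
subset, w = data[idx], np.full(m, data.size / m)

full = log_lik(2.0, data)
approx = log_lik(2.0, subset, w)
print(full, approx)  # the weighted subset tracks the full-data value
```

Any inference algorithm that only touches the data through such log-likelihood sums can be run on the weighted subset unchanged; the coreset method described in the talk chooses the subset and weights more carefully than this uniform baseline, which is why it achieves much lower approximation error at the same subset size.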

**Biography**

Tamara Broderick is the ITT Career Development Assistant Professor in the Department of Electrical Engineering and Computer Science at MIT. She is a member of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL), the MIT Statistics and Data Science Center, and the Institute for Data, Systems, and Society (IDSS). She completed her Ph.D. in Statistics at the University of California, Berkeley in 2014. Previously, she received an AB in Mathematics from Princeton University (2007), a Master of Advanced Study for completion of Part III of the Mathematical Tripos from the University of Cambridge (2008), an MPhil by research in Physics from the University of Cambridge (2009), and an MS in Computer Science from the University of California, Berkeley (2013). Her recent research has focused on developing and analyzing models for scalable Bayesian machine learning. She has been awarded an NSF CAREER Award (2018), a Sloan Research Fellowship (2018), an Army Research Office Young Investigator Program award (2017), Google Faculty Research Awards, the ISBA Lifetime Members Junior Researcher Award, the Savage Award (for an outstanding doctoral dissertation in Bayesian theory and methods), the Evelyn Fix Memorial Medal and Citation (for the Ph.D. student on the Berkeley campus showing the greatest promise in statistical research), the Berkeley Fellowship, an NSF Graduate Research Fellowship, a Marshall Scholarship, and the Phi Beta Kappa Prize (for the graduating Princeton senior with the highest academic average).

**Event Details**

Tamara Broderick (MIT), “Automated Scalable Bayesian Inference via Data Summarization”
October 26, 2018, 12:00 PM–1:15 PM (Eastern)
Hackerman Hall B17, 3400 N Charles St, Baltimore, MD 21218, USA
Cost: free
https://www.clsp.jhu.edu/events/tamara-broderick-mit/