Saddle-point methods for approximate inference in Bayesian belief networks – Fernando J. Pineda (Applied Physics Laboratory at Johns Hopkins University)

When:
November 9, 1999 all-day

View Seminar Video
Abstract
Bayesian belief networks (BBNs) are a class of joint distribution functions that have a graphical representation. Inference with BBNs is performed by propagating probabilities throughout the graph via repeated application of Bayes’ rule. Both inference and approximate inference in general BBNs are known to be NP-hard, so BBNs cannot be applied to large-scale systems without the development of efficient approximate algorithms. The similarity between certain parameterized BBNs (e.g. sigmoid belief networks) and complex physical systems (e.g. the Ising model of ferromagnetism) recently prompted the successful application of variational methods to the problem of approximate inference with parameterized BBNs. After a brief overview, this talk will focus on two novel approximate inference algorithms for a class of parameterized belief networks. The algorithms are a consequence of applying saddle-point methods from statistical physics. The first algorithm yields a previously unknown and easy-to-calculate upper bound on posterior probabilities. The second algorithm is a Gaussian approximation that is significantly more precise than upper- and lower-bound techniques and takes into account correlations between random variables. If time permits, the proposed application of these algorithms to the problem of rapid microorganism identification via database search will be discussed.
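As context for the abstract's point that exact inference in parameterized BBNs scales poorly, the sketch below shows a tiny sigmoid belief network with posterior probabilities computed by brute-force enumeration. This is not code from the talk: the network structure, weights, and function names are illustrative assumptions, and the exponential enumeration cost it makes visible is exactly what variational and saddle-point approximations aim to avoid.

# Minimal illustrative sketch (not from the talk): a small sigmoid belief
# network with exact posterior inference by enumeration. All names, weights,
# and the network structure are assumptions for illustration only.
import itertools
import math

import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))


def joint_prob(states, weights, biases):
    """P(s) = prod_i P(s_i | parents(s_i)) with sigmoid conditionals.

    `weights` is strictly lower-triangular, so node i depends only on
    nodes j < i (a topological ordering of the directed graph).
    """
    p = 1.0
    for i, s_i in enumerate(states):
        activation = biases[i] + sum(weights[i][j] * states[j] for j in range(i))
        p_on = sigmoid(activation)
        p *= p_on if s_i == 1 else (1.0 - p_on)
    return p


def posterior(evidence, weights, biases, n_nodes):
    """P(hidden | evidence), summing over every hidden configuration."""
    hidden = [i for i in range(n_nodes) if i not in evidence]
    unnormalized = {}
    for assignment in itertools.product([0, 1], repeat=len(hidden)):
        states = [0] * n_nodes
        for i, v in evidence.items():
            states[i] = v
        for i, v in zip(hidden, assignment):
            states[i] = v
        unnormalized[assignment] = joint_prob(states, weights, biases)
    z = sum(unnormalized.values())
    return {k: v / z for k, v in unnormalized.items()}


# Example: 4 binary nodes; observe node 3 "on" and compute the posterior
# over the remaining nodes. With n hidden nodes the loop runs 2^n times,
# which is why exact inference breaks down for large networks.
rng = np.random.default_rng(0)
n = 4
W = np.tril(rng.normal(size=(n, n)), k=-1)
b = rng.normal(size=n)
for config, prob in sorted(posterior({3: 1}, W, b, n).items()):
    print(config, round(prob, 4))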
Biography
Dr. Fernando Pineda is a member of the Principal Professional Staff at the Johns Hopkins Applied Physics Laboratory, a part-time lecturer in the JHU Department of Computer Science, and a collaborator on various research projects in the JHU Department of Electrical and Computer Engineering. He has served on the editorial boards of Neural Computation, IEEE Transactions on Neural Networks, Neural Networks, Applied Intelligence, and the APL Technical Digest. He has interests in physics, machine learning, neural networks, bioinformatics, and analog VLSI.

Center for Language and Speech Processing