Course Description for ECE 520.674
Information Theoretic Methods in Statistics
Wed 4:00-7:00 PM, Spring 2007 (3 Credits)
Information Theory has been primarily motivated by
problems in telecommunication systems, e.g., minimizing the
description length of random processes or maximizing the number of
distinguishable signals in the presence of noise. This optimization
perspective has led to a number of insights that contribute to the
body of knowledge in Statistics and Probability Theory. This course
will discuss several applications of information theoretic methods in
Statistics.
The course will begin with a very brief introduction to
information theory. Notions of entropy, mutual information, and
Kullback-Leibler divergence will be reviewed. Their significance in
data compression and error correcting codes will be brought out via
the source- and channel-coding theorems.
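To make these reviewed notions concrete, here is a small illustrative sketch (not course material) that computes entropy and Kullback-Leibler divergence for a toy joint distribution, and recovers mutual information as the divergence between the joint pmf and the product of its marginals; the pmf values are made up for illustration.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a pmf given as a 1-D array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

# An arbitrary joint pmf over (X, Y), chosen only for illustration.
pxy = np.array([[0.25, 0.25],
                [0.40, 0.10]])
px = pxy.sum(axis=1)   # marginal of X
py = pxy.sum(axis=0)   # marginal of Y

# Mutual information as a divergence: I(X;Y) = D(p(x,y) || p(x)p(y)).
mi = kl_divergence(pxy.ravel(), np.outer(px, py).ravel())
print(f"H(X) = {entropy(px):.4f} bits, I(X;Y) = {mi:.4f} bits")
```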
Three application areas will form the bulk of the course:
Information Geometry: The notion of I-projections will be
introduced. Examples will be drawn from problems encountered in
maximum entropy and maximum likelihood estimation, the EM algorithm,
etc. Projection of a probability mass function onto a linear
family of probability mass functions, and iterative algorithms for
finding the minimum divergence between two convex sets of probability
mass functions will be studied.
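As a hedged sketch of the simplest case, the following code performs an I-projection of a pmf q onto a linear family defined by a single expectation constraint. The minimizer of D(p || q) has an exponential (tilted) form, and the tilt parameter can be found by bisection because the constrained expectation is monotone in it. The function name, the reference pmf, and the constraint value are all hypothetical choices for illustration.

```python
import numpy as np

def i_project(q, f, alpha, lo=-50.0, hi=50.0, tol=1e-10):
    """I-projection of pmf q onto the linear family
    {p : sum_x p(x) f(x) = alpha}.  The minimizer of D(p || q) has the
    exponential form p(x) proportional to q(x) * exp(lam * f(x)); lam
    is found by bisection, since E_p[f] increases monotonically in lam."""
    def tilted(lam):
        w = q * np.exp(lam * f)
        return w / w.sum()
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if tilted(mid) @ f < alpha:
            lo = mid
        else:
            hi = mid
    return tilted(0.5 * (lo + hi))

q = np.array([0.7, 0.2, 0.1])   # reference pmf over {0, 1, 2}
f = np.array([0.0, 1.0, 2.0])   # constrain the mean of X
p = i_project(q, f, alpha=1.0)  # project onto {p : E_p[X] = 1}
print(p, p @ f)
```

With several linear constraints, iterating one-constraint projections of this kind is the basis of iterative scaling algorithms, one family of the iterative methods the paragraph above refers to.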
Large Deviations: Error exponents, Sanov's theorem and related
results will be presented from an information theoretic viewpoint. The
role of I-divergence will be highlighted, and error bounds for
selected estimation problems will be presented as examples.
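The flavor of these results can be previewed numerically. In the sketch below (an illustrative assumption, not an excerpt from the course), the exact probability that a fair coin shows at least 80% heads is compared against the Sanov/Chernoff exponent D(Bern(0.8) || Bern(0.5)): the normalized log-probability approaches the I-divergence as n grows.

```python
import math

def kl_bernoulli(a, b):
    """D(Bern(a) || Bern(b)) in nats."""
    return a * math.log(a / b) + (1 - a) * math.log((1 - a) / (1 - b))

def tail_prob(n, k):
    """P(at least k heads in n fair-coin flips), computed exactly."""
    return sum(math.comb(n, j) for j in range(k, n + 1)) / 2.0 ** n

# Sanov / Chernoff: P(mean >= 0.8) decays like exp(-n D(0.8 || 0.5))
# to first order in the exponent, so -(1/n) log P approaches the
# divergence as n grows.
for n in (50, 200, 800):
    p = tail_prob(n, int(0.8 * n))
    print(n, -math.log(p) / n, kl_bernoulli(0.8, 0.5))
```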
Redundancy and Data Compression: Lossy and lossless compression
of data will be investigated. In particular, Rissanen's Minimum
Description Length (MDL) principle, and other model-based techniques
for universal lossless data compression will be studied, and
connections with statistical modelling will be explored.
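A minimal sketch of the two-part MDL idea, under simplifying assumptions (a single Bernoulli parameter quantized to precision 1/sqrt(n), costing (1/2) log2 n bits, with the coin-flip data and code lengths invented for illustration): a richer model is selected only when its shorter data description outweighs the cost of describing its parameter.

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mdl_codelengths(n, k):
    """Two-part MDL code lengths (bits) for n coin flips with k heads.
    Model 0: fair coin, no parameters -> n bits for the data.
    Model 1: Bernoulli(theta), theta quantized to precision 1/sqrt(n)
    -> (1/2) log2 n bits for theta, plus n * H(k/n) bits for the data."""
    l_fair = float(n)
    l_biased = 0.5 * math.log2(n) + n * h2(k / n)
    return l_fair, l_biased

# 1000 flips, 700 heads: the one-parameter model pays its parameter
# cost and still wins; with 510 heads the fair-coin model is shorter.
print(mdl_codelengths(1000, 700))
print(mdl_codelengths(1000, 510))
```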
Participants are expected to be familiar with probability and
statistics at the level of an advanced undergraduate or core graduate
course. Exposure to information theory (ECE 520.447) will
be helpful but not essential.
Instructor: Sanjeev Khudanpur
Office: Barton 221, 410-516-7024
email: MyLastName at jhu dot edu