Course Description for ECE 520.674

Information Theoretic Methods in Statistics

Wed 4:00-7:00 PM, Spring 2007 (3 Credits)


Information Theory has been motivated primarily by problems in telecommunication systems, e.g., minimizing the description length of a random process or maximizing the number of signals distinguishable in the presence of noise. This optimization perspective has led to a number of insights that contribute to the body of knowledge in Statistics and Probability Theory. This course will discuss several applications of information theoretic methods in statistics.

The course will begin with a very brief introduction to information theory. Notions of entropy, mutual information, and Kullback-Leibler divergence will be reviewed. Their significance in data compression and error correcting codes will be brought out via the source- and channel-coding theorems.
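For reference, for discrete random variables X and Y with joint distribution p(x, y), and for two distributions p and q on a common alphabet, these quantities are defined as

    H(X) = -\sum_{x} p(x) \log p(x),

    I(X; Y) = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)},

    D(p \| q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}.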

Three application areas will form the bulk of the course.

Participants are expected to be familiar with probability and statistics at the level of an advanced undergraduate or core graduate course. Exposure to information theory (ECE 520.447) will be helpful but not essential.


Instructor: Sanjeev Khudanpur
Office: Barton 221, 410-516-7024
email: MyLastName at jhu dot edu