EE/Ma/CS 126 ab
Information Theory
9 units (3-0-6) | first, second terms
Prerequisites: Ma 3.

Shannon's mathematical theory of communication, 1948-present. Entropy, relative entropy, and mutual information for discrete and continuous random variables. Shannon's source and channel coding theorems. Mathematical models for information sources and communication channels, including memoryless, Markov, ergodic, and Gaussian. Calculation of capacity and rate-distortion functions. Universal source codes. Side information in source coding and communications. Network information theory, including multiuser data compression, multiple access channels, broadcast channels, and multiterminal networks. Discussion of philosophical and practical implications of the theory. This course, when combined with EE 112, EE/Ma/CS/IDS 127, EE/CS 161, and EE/CS/IDS 167, should prepare the student for research in information theory, coding theory, wireless communications, and/or data compression.

Instructors: Effros, Hamkins