ECE 776: Information Theory
Tentative Schedule
| Date | Major topic | Keywords | Textbook sections | Homework/Handouts |
|---|---|---|---|---|
| Jan. 25 | 1. Information measures and basic tools | Overview and motivation; review of probability; entropy, mutual information; inequalities (Jensen's, Fano's, log-sum, data-processing), sufficient statistics; typical sets, asymptotic equipartition property (AEP); Markov chains, entropy rates | 1, 2, 3, 4.1-4.3, 4.5 | |
| Feb. 1 | | | | Homework 1 |
| Feb. 8 | | | | |
| Feb. 15 | | | | Homework 2 |
| Feb. 22 | 2. Lossless source coding | Source coding theorem; Kraft inequality, Huffman codes | 5.1-5.4, 5.6, 5.8 | Homework 3 |
| Feb. 29 | 3. Noisy channel coding | Channels and channel capacity; channel coding and error probability; channel coding theorem; channels with feedback; separation of source and channel coding; joint source-channel coding | 7.1-7.7, 7.9-7.10, 7.12-7.13; Gallager: Chapter 5 | Homework 4 |
| Mar. 7 | | | | |
| Mar. 21 | Midterm | | | |
| Mar. 28 | 3. Noisy channel coding (cont.) | Channels and channel capacity; channel coding and error probability; channel coding theorem; channels with feedback; separation of source and channel coding; joint source-channel coding | 7.1-7.7, 7.9-7.10, 7.12-7.13; Gallager: Chapter 5 | Homework 5 |
| Apr. 4 | | | | Homework 6 |
| Apr. 11 | 4. Gaussian channels | Differential entropy; scalar Gaussian channels; Gaussian channels with feedback | 8.1, 8.3, 8.5, 9 | |
| Apr. 18 | | | | Homework 7 |
| Apr. 25 | 5. Lossy source coding | Single-letter distortion measures; rate-distortion theorem; rate-distortion function | 10.1-10.5 | |
| May 2 | | | | Homework 8 |
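As a warm-up for topic 1, the entropy definition can be computed directly from a probability mass function. This is an illustrative sketch, not part of the course materials; the function name `entropy` and the example distributions are ours.

```python
import math

def entropy(pmf):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits.

    Zero-probability outcomes contribute nothing (0 log 0 := 0).
    """
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# A fair coin carries one bit of uncertainty; a biased coin carries less.
print(entropy([0.5, 0.5]))  # → 1.0
print(entropy([0.9, 0.1]))  # ≈ 0.469
```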
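Similarly, a minimal sketch of topic 2, assuming binary codewords: build a Huffman code by repeatedly merging the two lightest subtrees, then verify the Kraft inequality (sum of 2^-length over all codewords is at most 1) for the resulting prefix code. All names here are illustrative, not from the textbook.

```python
import heapq

def huffman_code(weights):
    """Binary Huffman code for a dict {symbol: probability or weight}."""
    # Heap entries: (weight, tiebreaker, {symbol: codeword-so-far}).
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(sorted(weights.items()))]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w0, _, left = heapq.heappop(heap)
        w1, _, right = heapq.heappop(heap)
        # Prefix '0' on the lighter subtree's codewords, '1' on the other.
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (w0 + w1, tie, merged))
        tie += 1
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
# Kraft inequality holds for any prefix code; Huffman meets it with equality here.
assert sum(2.0 ** -len(c) for c in code.values()) <= 1.0
```

For the dyadic distribution above, the average codeword length equals the entropy (1.75 bits), the best case promised by the source coding theorem.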