Short Course Description
Information measures: self-information of a message, entropy of a message ensemble, conditional entropy, mutual information, and information divergence. Discrete information sources: entropy of a stationary discrete source and the entropy rate. Lossless source coding: block coding and variable-length coding. Discrete channels: channel capacity, the converse of the channel coding theorem, and Fano's inequality. The channel coding theorem. Information measures for continuous-valued ensembles and differential entropy. Channel capacity under an average power constraint. Channel capacity with additive white Gaussian noise and with colored Gaussian noise. The rate-distortion function of an information source. Source coding with distortion. The general theorem of information theory for point-to-point communication. Introduction to multi-user information theory.
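As a small illustration of the first topic (this snippet is not part of the official course description, just a sketch of the entropy definition covered in the course):

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum p log2 p of a discrete distribution, in bits."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# A fair coin carries exactly one bit of uncertainty per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))   # about 0.469
```

The guard `if q > 0` reflects the convention 0 log 0 = 0 used throughout information theory.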
The full syllabus is to be published.