This course deals with fundamental limits on communication, covering the following topics: entropy, relative entropy, and mutual information; entropy rates for stochastic processes; differential entropy; data compression; the Kraft inequality; Shannon-Fano codes; Huffman codes; arithmetic coding; channel capacity; discrete channels; the random coding bound and its converse; the capacity of Gaussian channels; the sphere-packing bound; coloured Gaussian noise and water-filling; rate-distortion theory; the rate-distortion function; multiuser information theory.
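As a small taste of the compression topics above, the following sketch (a hedged illustration using only the Python standard library, not course material) computes the Shannon entropy of a discrete distribution and builds a Huffman code for it; for the dyadic distribution chosen here, the average code length meets the entropy bound exactly.

```python
import heapq
from math import log2

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def huffman(freqs):
    """Build a binary Huffman code for a symbol -> probability map."""
    # Heap entries are (weight, counter, tree); the unique counter
    # breaks ties so trees are never compared directly.
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    n = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)  # two lightest subtrees
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, n, (t1, t2)))  # merge them
        n += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):     # internal node: branch 0 / 1
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:                           # leaf: assign the codeword
            codes[tree] = prefix or "0"
    walk(heap[0][2], "")
    return codes

p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
codes = huffman(p)
avg_len = sum(p[s] * len(codes[s]) for s in p)
print(entropy(p.values()), avg_len)  # both 1.75 bits
```

The codeword lengths also satisfy the Kraft inequality with equality, since the Huffman tree is a full binary tree.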