Introduction: Entropy and mutual information theory: joint entropy, conditional entropy, relationship between entropy and mutual information, chain rules for entropy, relative entropy, mutual information, Jensen's inequality, Fano's inequality.
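These quantities can be checked numerically. The sketch below (an illustration for this unit, not part of the syllabus) computes H(X), H(Y), and H(X,Y) for a small joint distribution chosen arbitrarily for the example, and evaluates mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import math

# Hypothetical joint distribution p(x, y) over two binary variables,
# chosen only for illustration.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def entropy(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal distributions p(x) and p(y)
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

H_X, H_Y = entropy(px), entropy(py)
H_XY = entropy(joint)

# Mutual information from the identity I(X;Y) = H(X) + H(Y) - H(X,Y)
I_XY = H_X + H_Y - H_XY
```

For this distribution both marginals are uniform, so H(X) = H(Y) = 1 bit, while the dependence between X and Y makes I(X;Y) strictly positive.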
An introduction to codes: Kraft inequality, optimal codes, bounds on optimal code length, Kraft inequality for uniquely decodable codes; Shannon and Huffman codes; Shannon, Fano, and Elias codes; block codes, linear block codes, cyclic codes.
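A small numerical sketch of the first two topics (the probabilities are illustrative, not from the syllabus): Shannon code lengths l_i = ceil(log2(1/p_i)) always satisfy the Kraft inequality, sum of 2^(-l_i) <= 1, which guarantees a prefix code with those lengths exists.

```python
import math

# Illustrative source probabilities (an assumption for the example)
probs = [0.4, 0.3, 0.2, 0.1]

# Shannon code lengths: l_i = ceil(log2(1 / p_i))
lengths = [math.ceil(math.log2(1 / p)) for p in probs]

# Kraft sum: must be <= 1 for a prefix (hence uniquely decodable) code to exist
kraft_sum = sum(2 ** (-l) for l in lengths)

# Average length is within one bit of the entropy: H <= L < H + 1
avg_len = sum(p * l for p, l in zip(probs, lengths))
H = -sum(p * math.log2(p) for p in probs)
```

Here the lengths come out as 2, 2, 3, 4, giving a Kraft sum well below 1; a Huffman code for the same source would do at least as well as these Shannon lengths.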
Efficient encoding: information sources; average code word length; Huffman encoding; noiseless coding: the noiseless coding theorem.
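Huffman encoding can be sketched in a few lines with a priority queue; the version below (a minimal illustration, with made-up source probabilities) builds the code by repeatedly merging the two least probable subtrees, then compares the average code word length against the source entropy, as the noiseless coding theorem requires.

```python
import heapq
import math

def huffman_code(probs):
    """Build a binary Huffman code for {symbol: probability}.
    Returns {symbol: codeword string}."""
    # Heap entries: (probability, tiebreak id, {symbol: partial codeword});
    # the tiebreak id keeps the heap from ever comparing two dicts.
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        # Prepend 0 to one subtree's codewords and 1 to the other's
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

# Illustrative source with dyadic probabilities (an assumption for the example)
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)

avg_len = sum(probs[s] * len(code[s]) for s in probs)
H = -sum(p * math.log2(p) for p in probs.values())
```

Because the probabilities here are all powers of 1/2, the Huffman code meets the entropy bound exactly: both the entropy and the average code word length equal 1.75 bits per symbol.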
Channel capacity: discrete memoryless channels and capacity, examples of channel capacity, symmetric channels, properties of channel capacity, the channel coding theorem.
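The standard worked example for this unit is the binary symmetric channel, whose capacity has the closed form C = 1 - H(p), with H the binary entropy function and p the crossover probability. A minimal sketch:

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover probability p:
    C = 1 - H(p) bits per channel use."""
    return 1.0 - h2(p)
```

A noiseless channel (p = 0) has capacity 1 bit per use, a completely noisy one (p = 0.5) has capacity 0, and by symmetry a channel with crossover p has the same capacity as one with crossover 1 - p.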
Theory and practice of error-control coding: the trellis diagram and the Viterbi algorithm, convolutional coding in mobile communications, modern graph-based codes (turbo codes and LDPC codes), the main coding theory problem.
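The trellis search at the heart of the Viterbi algorithm fits in a short sketch. The example below (an illustration, not the syllabus's prescribed code) uses the standard rate-1/2, constraint-length-3 convolutional code with generators (7, 5) in octal, and does hard-decision decoding by minimizing Hamming distance along trellis paths:

```python
def conv_encode(bits, state=0):
    """Rate-1/2 convolutional encoder, generators g0=111, g1=101 (octal 7, 5).
    The 2-bit state holds the two most recent input bits."""
    out = []
    for b in bits:
        s = (b << 2) | state                       # 3-bit register contents
        out.append(((s >> 2) ^ (s >> 1) ^ s) & 1)  # output of g0 = 111
        out.append(((s >> 2) ^ s) & 1)             # output of g1 = 101
        state = s >> 1                             # shift the register
    return out

def viterbi_decode(received, n_states=4):
    """Hard-decision Viterbi decoding: find the trellis path whose encoder
    output has minimum Hamming distance to the received bit pairs."""
    INF = float("inf")
    metric = [0] + [INF] * (n_states - 1)  # encoder starts in state 0
    paths = [[] for _ in range(n_states)]
    for i in range(0, len(received), 2):
        r0, r1 = received[i], received[i + 1]
        new_metric = [INF] * n_states
        new_paths = [[] for _ in range(n_states)]
        for state in range(n_states):
            if metric[state] == INF:
                continue
            for b in (0, 1):               # the two branches leaving each state
                s = (b << 2) | state
                o0 = ((s >> 2) ^ (s >> 1) ^ s) & 1
                o1 = ((s >> 2) ^ s) & 1
                nxt = s >> 1
                m = metric[state] + (o0 != r0) + (o1 != r1)
                if m < new_metric[nxt]:   # keep the survivor into each state
                    new_metric[nxt] = m
                    new_paths[nxt] = paths[state] + [b]
        metric, paths = new_metric, new_paths
    best = min(range(n_states), key=lambda st: metric[st])
    return paths[best]
```

This code has free distance 5, so with the message terminated by two flush zeros, a single channel bit error in the middle of the block is corrected: decoding the received sequence with one flipped bit recovers the original input.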
Prerequisites: None
Text Books:
- T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed. Wiley-Interscience, 2006. ISBN-13: 978-0471241959.
- S. Lin and D. J. Costello, Error Control Coding, 2nd ed. Pearson Prentice Hall, 2004, ISBN-13: 978-0130426727.
Reference Books:
- R. G. Gallager, Information Theory and Reliable Communication. Wiley, 1968, ISBN-13: 978-0471290483.
- I. Csiszar and J. Korner, Information Theory: Coding Theorems for Discrete Memoryless Systems. Akademiai Kiado, December 1981, ISBN-13: 978-9630574402.
- T. S. Han, Information-Spectrum Methods in Information Theory. Springer, 2002, ISBN-13: 978-3642078125.
- A. Neubauer, J. Freudenberger, and V. Kuhn, Coding Theory: Algorithms, Architectures and Applications. Wiley India, 2007, ISBN-13: 978-81-265-3432-6.
- R. Bose, Information Theory, Coding and Cryptography. Tata McGraw-Hill, 2008, ISBN-13: 978-0-07-0669017.
- S. Roman, Introduction to Coding and Information Theory. Springer, ISBN-13: 978-0-387-94704-4.
Journal readings