| Unit name | Information Theory 3 |
| --- | --- |
| Unit code | MATH34600 |
| Credit points | 10 |
| Level of study | H/6 |
| Teaching block(s) | Teaching Block 2C (weeks 13 - 18) |
| Unit director | Dr. Wiesner |
| Open unit status | Not open |
| Pre-requisites | Probability 1 (Statistics 1 is helpful, but not necessary) |
| Co-requisites | None |
| School/department | School of Mathematics |
| Faculty | Faculty of Science |
Shannon's information theory is one of the great intellectual achievements of the 20th century, which, half a century after its conception, continues to inspire communications engineering and to generate challenging mathematical problems. Recently it has extended dramatically into physics as quantum information theory.
Aims
To give a rigorous and modern introduction to Shannon's theory of information, with emphasis on fundamental concepts and mathematical techniques.
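As an illustrative, non-examinable sketch of the kind of fundamental quantity the unit studies, Shannon entropy of a discrete distribution can be computed in a few lines (the function name and examples here are my own, not part of the course materials):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i log2(p_i), measured in bits.

    Terms with p_i = 0 contribute nothing, using the convention 0 log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty:
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so it carries less:
print(shannon_entropy([0.9, 0.1]))   # about 0.47 bits
```

Entropy of this kind underlies the source and channel coding theorems that form the core of any first treatment of the subject.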
Syllabus
Relation to Other Units
The probabilistic nature of the problems considered, and of the mathematical modelling, relates this unit to the probability and statistics units at levels 1, 2 and 3. It is well suited as a companion to the Quantum Computation and Information unit.
Related courses in Computer Science include Computational Complexity and Cryptography and Data Security; in Electrical Engineering, Communications; and in Physics, Thermal Physics and Statistical Physics.
This unit should enable students to:
Transferable skills:
Mathematical - Knowledge of basic information theory; probabilistic reasoning.
General skills - Modelling, problem solving and logical analysis; assimilation and use of complex and novel ideas.
Lectures, together with exercises to be done by students and problem classes.
The final assessment mark for Information Theory is calculated from a 1½-hour written examination consisting of THREE questions. A candidate's best TWO answers will be used for assessment. Calculators of an approved type (non-programmable, no text facility) are allowed.
There are many textbooks on the elements of information theory. The course follows such treatises only in its first half; the second half presents more modern material. Nevertheless, as a rigorous and affordable companion for the student, I can recommend:

1. R B Ash. Information Theory, Dover Publications, 1990.

My favourite is the rather encyclopedic, well-written and very up-to-date book:

2. T M Cover & J A Thomas. Elements of Information Theory, Wiley Interscience, 1991.

Other useful references are:

3. C E Shannon & W Weaver. The Mathematical Theory of Communication, University of Illinois Press, 1963.
4. I Csiszar & J Koerner. Information Theory: Coding Theorems for Discrete Memoryless Systems (2nd ed.), Akademiai Kiado, Budapest, 1997.

The course requires only elementary probability theory, but students who have taken further probability will find some of the course content easier. A very good reference is:

5. G R Grimmett & D Welsh. Probability: An Introduction, Oxford University Press, 1986.

Additional reading for the probabilistic method:

N Alon & J H Spencer. The Probabilistic Method (2nd ed.), Wiley Interscience, 2000.