> *Essentially, all models are wrong, but some are useful.* — George E. P. Box

## Course Description

Information theory is a young branch of mathematics created to
study digital data and digital communication. One problem is that of
data compression: how can a digitally encoded message be rewritten so
that it occupies less space on physical media, such as disk or memory?
Another is that of error correction: given a noisy communication
channel, how can a digital message be rewritten so that it is
transmitted accurately with high probability? The answers to both
problems revolve around "Shannon entropy", a single number computed
from the distribution of symbols in the message, which determines how
efficiently each problem can be solved. This course covers topics in
probability theory, mathematical modeling, and the algebra of finite
fields, and demonstrates how these branches of mathematics work
together to solve problems in compression and error correction. Every
aspect of the course is illustrated with short programs written in
MATLAB.
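To give a flavor of the central quantity: Shannon entropy can be computed directly from the empirical symbol frequencies of a message. The course examples are in MATLAB; the sketch below uses Python purely as a language-neutral illustration, and the function name `shannon_entropy` is our own choice, not part of the course materials.

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Shannon entropy, in bits per symbol, of the empirical
    symbol distribution of `message`."""
    counts = Counter(message)
    n = len(message)
    # H = -sum over symbols of p * log2(p), where p is the
    # relative frequency of each symbol in the message.
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Four symbols with equal frequency: entropy is log2(4) = 2 bits
# per symbol, so no lossless code can beat 2 bits per symbol here.
print(shannon_entropy("abcdabcd"))  # 2.0
```

A message using a single repeated symbol has entropy 0, reflecting that it carries no information and compresses to almost nothing.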

The topics covered will include:

- Fundamentals of Data Compression.
- Fundamentals of Error-Correcting Codes.
- Relevant Topics in Applied Probability.
- Relevant Topics in Finite Fields.
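As a first taste of error correction, the simplest scheme is the triple-repetition code: send each bit three times and decode by majority vote, so any single flipped bit per block is corrected. This Python sketch (the course itself uses MATLAB) is an illustration of the idea, not a course assignment:

```python
def encode(bits):
    """Repeat each bit three times: [1, 0] -> [1, 1, 1, 0, 0, 0]."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority vote within each block of three received bits."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
sent = encode(message)
sent[0] ^= 1   # the channel flips one bit in the first block
sent[4] ^= 1   # ...and one bit in the second block
print(decode(sent))  # [1, 0, 1, 1] -- both errors corrected
```

The price is a threefold expansion of the message; much of the course is about codes that achieve the same reliability far more efficiently, with Shannon entropy setting the limit on what is possible.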