Teoria informacji i teoria kodowania (Information and Coding Theory)

Language of instruction English
Semester Winter
Status Put to a vote
Coordinator Artur Jeż
Hours 30 (lecture) 30 (tutorial)
Type I2.T - CS theory
ECTS 6
Recommended for 1st year No
Exam Yes
Tags AZ (algorithmics and computational complexity) PD (data processing)

Course description:

Information theory studies problems such as:

* what information is and how to measure it
* how information changes during processing
* how to measure information in continuous data
* limitations on data processing
* limitations of communication

Information theory arose as an attempt to answer questions like:

* what is the optimal compressor for a given type of data?
* how fast can we transmit information through a communication channel?
* how to construct lossy compression, in particular for continuous data?

These investigations quickly led to much more general considerations, such as:

* how to measure the quality of a compressor?
* how to measure the transmission speed of data through a communication channel?

The approach given by information theory differs from that of "worst-case analysis": it focuses much more on the analysis of random processes and the processing of such processes. In particular, some of its methods and tools stem from probability theory and statistics. This approach turns out to be effective in practice: the tools and concepts developed by information theory are widely used in analyzing practically implemented solutions, such as lossless compression, lossy compression, error correction, and signal processing. On the other hand, the methods of information theory offer an alternative perspective on the foundations of statistics and also turn out to be surprisingly effective as measures used in the context of artificial intelligence.

This course is independent of (and disjoint from) the course *Data Compression*. Compared to *Data Compression*, this course is much more focused on theoretical aspects. For example, while *Data Compression* presents a variety of compression algorithms and provides intuitions as to why they work, here we will define what it means for a compressor to be effective and prove theorems about the effectiveness of particular compressors.
The course has little overlap with the course *Error-Correcting Codes*; eliminating this overlap is difficult under the current system of course selection.

### Topics:

* Measures of information: entropy, divergence, mutual information, Markov chains, the data processing inequality
* Lossless compression: variable-length codes, ergodic sources, universal coding
* Communication channel capacity: Shannon's theorem; error-correcting codes in the probabilistic model; polar codes
* Continuous distributions: information measures (differential entropy), encoding in continuous signals
* Definition and examples of universal compression: LZ77, grammar-based compression
* Lossy compression: quantization, audio and image compression, the Karhunen–Loève transform, the discrete cosine transform (DCT)
* Kolmogorov complexity and its relations with entropy

### Prerequisites:

* Probability theory (elements of statistics welcome)
* Mathematical analysis
* Algebra (only linear algebra is required)

### Literature:

1. Thomas M. Cover, Joy A. Thomas, Elements of Information Theory. Wiley Series in Telecommunications, 1991.
2. Gareth A. Jones, Mary J. Jones, Information and Coding Theory. Springer, 2000.
3. David Salomon, Data Compression: The Complete Reference. Springer, 1998.
4. Raymond W. Yeung, A First Course in Information Theory. Springer, 2002.
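To give a flavor of the first topic, the Shannon entropy of a discrete distribution can be computed in a few lines of Python. This sketch is purely illustrative and is not part of the course materials; the function name `entropy` is chosen here for convenience:

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i == 0 contribute 0 by convention (lim x*log x = 0).
    """
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly 1 bit of information per toss,
# while a biased coin is more predictable and carries less:
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # approximately 0.469
```

Entropy is the benchmark against which the course measures compressors: no lossless code can achieve an expected length per symbol below the source entropy.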