Information Theory (TD502)

The aim of the course is to familiarize students with the principles, concepts and applications of information theory. Information theory is the fundamental field of study of signal transmission and compression, concerned with quantifying information so that as much data as possible can be reliably stored on a medium or transmitted over a communication channel. The measure of information, known as entropy, is usually expressed as the average number of bits required for storage or communication.


Upon successful completion of the course, students will be able to:

Knowledge level:
1. Understand the basic definitions and concepts of probability.
2. Describe the concepts of entropy, information and redundancy.
3. Study discrete and continuous information sources with and without memory.
4. Describe the Shannon, Huffman, Fano, Shannon-Fano-Elias and Lempel-Ziv coding algorithms.
5. Describe the concept of channel capacity for a noiseless channel and a channel with AWGN noise.
6. Describe block codes, linear codes and convolutional channel codes.
7. Describe soft-decision decoding.
8. Describe the lossless compression formats zip, bzip, pkzip, gzip and 7zip.
9. Describe the lossy JPEG, MPEG and H.26X coding standards.

Skills level:
1. Compute the entropy of sources with and without memory.
2. Apply the Shannon, Huffman, Fano, Shannon-Fano-Elias and Lempel-Ziv coding algorithms to specific problems.
3. Evaluate the effects of noise in the channel.
4. Evaluate waveform coding algorithms.
5. Calculate block codes for a given problem.
6. Compare the structural differences between error detection and error correction codes.
7. Compare linear and cyclic codes.
8. Calculate convolutional codes for a given problem.

Competence level:
1. Apply the Shannon, Huffman, Fano, Shannon-Fano-Elias and Lempel-Ziv coding algorithms to specific problems.
2. Compare and evaluate source coding methods for sources with and without memory.
3. Design and evaluate block codes for a given problem.
4. Apply the Viterbi algorithm to a given problem.
5. Design block codes, interleaving codes and Reed-Solomon codes.
6. Design trellis diagrams.
7. Design combined source, channel and modulation coding systems as a whole.
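As a quick illustration of the first two skills listed above (computing the entropy of a memoryless source and applying Huffman coding), here is a minimal Python sketch; the function names and the example distribution are illustrative, not course material:

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p) of a memoryless source, in bits/symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Return the Huffman codeword length assigned to each symbol probability."""
    # Each heap entry: (subtree probability, unique id, symbol indices in the subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # merge the two least probable subtrees
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:                 # every symbol in the merged subtree gains one bit
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, uid, s1 + s2))
        uid += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)                                             # 1.75 bits/symbol
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))  # average codeword length
```

Because this example distribution is dyadic (all probabilities are powers of 1/2), the Huffman code achieves the entropy exactly: both H and the average codeword length L come out to 1.75 bits/symbol.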




Probability theory, information entropy, Shannon's theorem, discrete sources with and without memory, discrete channels, lossless data compression techniques (theory and the Huffman, Shannon and arithmetic coding algorithms), signals and noise, discrete and continuous channels, channel coding and channel capacity, source-channel separation, lossy compression and quantization, the rate-distortion function, error detection and correction coding (block codes, BCH codes, cyclic codes, Reed-Solomon codes), ARQ and interleaving techniques.
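Two of the capacity results covered in the syllabus can be sketched in a few lines of Python: the capacity of a binary symmetric channel, C = 1 - H2(p), and the Shannon-Hartley capacity of a band-limited AWGN channel, C = B·log2(1 + SNR). The function names and example figures are illustrative, not course material:

```python
from math import log2

def h2(p):
    """Binary entropy function H2(p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p: C = 1 - H2(p)."""
    return 1 - h2(p)

def awgn_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity of a band-limited AWGN channel, in bits/s."""
    return bandwidth_hz * log2(1 + snr_linear)

bsc_capacity(0.0)          # noiseless binary channel: 1 bit per channel use
bsc_capacity(0.11)         # about 0.5 bits per channel use
awgn_capacity(3000, 1000)  # about 29.9 kbit/s (telephone-line style example)
```

Note how capacity degrades with noise: at a crossover probability of 0.11 the BSC carries only about half a bit per use, and at p = 0.5 its capacity drops to zero.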






Instructors: Michael Paraskevas
Department: Computer and Informatics Engineering Department
Institution: TEI of Western Greece
Subject: Electrical Engineering, Electronic Engineering, Information Engineering
Rights: CC - Attribution-NonCommercial-ShareAlike

Visit Course Page