Lecturer(s)
- Konečný Jan, doc. RNDr. Ph.D.
Course content
1. Basic notions from probability: probability space, probability measure; conditional, joint, and marginal probability; independent events; random variables and related notions.
2. Basic notions of information theory: entropy, interpretations of entropy, basic properties, joint entropy, conditional entropy.
3. Further notions of information theory: divergence and its applications, mutual information, the asymptotic equipartition property (AEP). (The standard definitions of the quantities in items 2 and 3 are sketched after this list.)
4. Introduction to generalized information theory: monotone measures and some of their special cases (imprecise probabilities, possibility theory, Dempster-Shafer theory); uncertainty and information measures for these measures.
5. Selected applications of information theory: optimal codes (uniquely decipherable codes, prefix codes, the Kraft and McMillan inequalities, Shannon's noiseless coding theorem, block coding, the Huffman code, its construction and optimality); decision trees. (A Huffman construction sketch follows below.)
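For orientation, the quantities in items 2 and 3 admit a compact summary. The following is a minimal LaTeX sketch of the standard Shannon definitions for discrete random variables, following common textbook conventions (e.g. Ash, 1965) rather than any notation specific to this course:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Standard Shannon quantities for discrete random variables X, Y
% with joint distribution p(x, y); all logarithms are base 2,
% so every quantity is measured in bits.
\begin{align*}
H(X)          &= -\sum_{x} p(x)\log_2 p(x)               && \text{entropy}\\
H(X,Y)        &= -\sum_{x,y} p(x,y)\log_2 p(x,y)         && \text{joint entropy}\\
H(Y \mid X)   &= H(X,Y) - H(X)                           && \text{conditional entropy}\\
D(p \,\|\, q) &= \sum_{x} p(x)\log_2 \frac{p(x)}{q(x)}   && \text{divergence (relative entropy)}\\
I(X;Y)        &= H(X) + H(Y) - H(X,Y)                    && \text{mutual information}
\end{align*}
\end{document}
```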
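The Huffman construction from item 5 can be illustrated with a short sketch. Below is a minimal Python implementation of the textbook algorithm (the function name huffman_code and the example distribution are illustrative, not taken from the course materials): it greedily merges the two least probable nodes until a single code tree remains, and then checks the Kraft inequality on the resulting prefix code.

```python
import heapq

def huffman_code(probs):
    """Build a binary Huffman code for a dict {symbol: probability}.

    Greedy construction: repeatedly merge the two least probable
    nodes; the resulting prefix code has minimal expected length.
    """
    # Heap entries: (probability, tie-breaker, {symbol: partial codeword}).
    # The integer tie-breaker keeps tuple comparison away from the dicts.
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, code1 = heapq.heappop(heap)  # least probable node
        p2, _, code2 = heapq.heappop(heap)  # second least probable node
        # Prefix the two merged subtrees with 0 and 1 respectively.
        merged = {s: "0" + c for s, c in code1.items()}
        merged.update({s: "1" + c for s, c in code2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
print(code)  # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}

# Kraft inequality: sum of 2^(-length) over all codewords is at most 1;
# it holds with equality here because the code tree is complete.
assert sum(2 ** -len(w) for w in code.values()) <= 1
```

For this dyadic distribution the expected codeword length equals the entropy (1.75 bits), the boundary case of Shannon's noiseless coding theorem.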
Learning activities and teaching methods
Dialogic Lecture (Discussion, Dialog, Brainstorming), Work with Text (with Book, Textbook)
Learning outcomes
Students become familiar with basic and selected advanced concepts of information theory.
1. Knowledge: recognize and comprehensively understand the principles and methods of information theory.
Prerequisites
unspecified
Assessment methods and criteria
Oral exam
Completing the assignments and passing the exam.
Recommended literature
- Ash R. (1965). Information Theory. Dover, New York.
- Han T. S., Kobayashi K. (2002). Mathematics of Information and Coding. AMS, Providence, Rhode Island.
- Klir G. J. (2006). Uncertainty and Information. Foundations of Generalized Information Theory. J. Wiley, Hoboken, New Jersey.
- Pierce J. R. (1980). An Introduction to Information Theory. Symbols, Signals and Noise. Dover, New York.