IA170 Randomness and Communication

Faculty of Informatics
Autumn 2015
Extent and Intensity
2/1. 3 credits (plus extra credits for completion). Type of completion: zk (examination).
Teacher(s)
doc. RNDr. Jan Bouda, Ph.D. (lecturer)
Frédéric Dupont Dupuis, Ph.D. (seminar tutor)
Guaranteed by
prof. RNDr. Mojmír Křetínský, CSc.
Department of Theory of Programming – Faculty of Informatics
Contact person: doc. RNDr. Jan Bouda, Ph.D.
Supplier department: Department of Theory of Programming – Faculty of Informatics
Timetable
Tue 8:00–9:50 C511
  • Timetable of seminar/parallel groups:
IA170/01: each odd Tuesday 10:00–11:50 C416, J. Bouda
IA170/02: each even Tuesday 10:00–11:50 C416, J. Bouda
Prerequisites
Knowledge of basic discrete mathematics (e.g. as presented in the course IB000) and basic knowledge of probability theory (e.g. IV111).
Course enrolment limitations
The course is also offered to students of fields other than those it is directly associated with.
Fields of study/plans the course is directly associated with
Course objectives
At the end of the course, students should have sufficient knowledge and skills to independently study and solve problems in the areas of randomness generation, randomness post-processing, efficient use of real-world randomness in information processing, efficient communication with and without noise in the channel, and efficient distributed computation of functions. They should also acquire essential knowledge of information theory, and should be able to independently tackle new problems requiring knowledge of information theory, randomness, and communication complexity.
Syllabus
  • Operational meaning of information and entropy. Shannon entropy, relative entropy, mutual information. Rényi entropy. Min- and max-entropy.
  • Shannon (asymptotic) channel capacity. Assisted channel capacity, channels with memory. Shannon's forward and reverse channel capacity theorems. Shannon's zero-error capacity, Lovász's theta function.
  • Perfect and weak randomness, applications. Measuring randomness. Randomness extraction, extractors, seeded extractors, independent-source extractors. Pseudorandom structures: k-wise independent random variables, expanders.
  • Communication complexity. Communication-efficient distributed computation. Randomness and communication complexity.
Literature
    recommended literature
  • KUSHILEVITZ, Eyal and Noam NISAN. Communication Complexity. Cambridge: Cambridge University Press, 1997, xiii, 189 pp. ISBN 0521560675.
  • CSISZÁR, Imre and János KÖRNER. Information Theory: Coding Theorems for Discrete Memoryless Systems. 2nd ed. Cambridge: Cambridge University Press, 2011, xxi, 499 pp. ISBN 9780521196819.
  • STINSON, Douglas Robert. Cryptography: Theory and Practice. 3rd ed. Boca Raton: CRC Press, 2006, 593 pp. ISBN 1584885084.
  • COVER, T. M. and Joy A. THOMAS. Elements of Information Theory. 2nd ed. Hoboken, N.J.: Wiley-Interscience, 2006, xxiii, 748 pp. ISBN 0471241954.
  • MITZENMACHER, Michael and Eli UPFAL. Probability and Computing: An Introduction to Randomized Algorithms and Probabilistic Analysis. New York: Cambridge University Press, 2005, xvi, 352 pp. ISBN 0521835402.
Teaching methods
Theoretical lectures and practical examples in tutorials.
Assessment methods
A combination of a written test and an oral exam. A student who passes the written test must also pass the oral exam in order to achieve a grade of C or better.
Language of instruction
English
Teacher's information
The main aim of the course is to provide students with a practical understanding and the basic theoretical tools of randomness and efficient communication. The emphasis is not on theoretical concepts themselves, but on their practical meaning and applications. To illustrate this approach, consider information theory. In a standard course, students learn many well-known information measures, such as Shannon entropy or Shannon's mutual information. While these quantities are very important, a common misconception is to treat them as universal measures of uncertainty or information.
In this course we emphasize operational meaning, including subtle differences between the measures. Do we want to compress a block of data produced by an iid source? Shannon entropy measures the optimal result. What if the probability distribution of the source differs from the one we assume? Relative entropy measures our inefficiency. How many uniform bits can we extract from a non-uniform source? Min-entropy is the answer.
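For reference, the three operational statements above correspond to the following standard definitions (our formalization in common notation, not quoted from the course materials; all logarithms are base 2):

H(X) = -\sum_x p(x) \log_2 p(x)
    (Shannon entropy: the optimal asymptotic compression rate, in bits per symbol, for an iid source X)

D(P \| Q) = \sum_x P(x) \log_2 [P(x) / Q(x)]
    (relative entropy: the extra bits per symbol paid when a source distributed as P is compressed with a code optimized for Q)

H_{\min}(X) = -\log_2 \max_x p(x)
    (min-entropy: determines how many nearly uniform bits an extractor can obtain from X)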
The course is divided into three main areas. Information theory: compression of information, efficient communication, and protection against communication or storage noise. Randomness: definitions, applications, theoretical versus real-world randomness, improving the quality of randomness, and pseudorandomness as a substitute for true randomness. Communication complexity: minimizing the communication of distributed computational problems, and applications of randomness and entanglement.
The outline of the course:
Lecture 1 Review of the course structure and content. Introduction to Shannon's information theory. Applications of Shannon information theory: data compression (entropy), relative entropy (inefficiency of compression with an unknown distribution, hypothesis testing), mutual information (noisy channel coding), Shannon's noisy coding theorem
Lecture 2 Introduction to Shannon's information theory and its applications, continued
Lecture 3 Converse to Shannon's noisy coding theorem; assisted capacities: randomness, backward channels, channels with memory
Lecture 4 Converse to Shannon's noisy coding theorem and assisted capacities, continued
Lecture 5 Shannon's zero-error capacity, Lovász's theta function
Lecture 6 Dealing with non-iid sources, compression
Lecture 7 The notion of randomness (probability, unpredictability, Kolmogorov, Chaitin), motivation, possible sources of randomness
Lecture 8 High-quality randomness, applications of randomness (motivation for the adversarial model), weak randomness, the von Neumann extractor (a code sketch of this extractor follows the outline below)
Lecture 9 Min-entropy, Rényi entropies (Shannon and min-entropy are special cases of Rényi entropies). Min-entropy sources, randomness (seeded) extractors, two-source extractors, condensers
Lecture 10 Min-entropy sources, randomness (seeded) extractors, two-source extractors, condensers, continued
Lecture 11 Pseudorandomness. Historical pseudo-random number generators (congruential generators; a second sketch after the outline shows one). k-wise independent random variables. Expanders. Provable pseudorandom number generators.
Lecture 12 Introduction to communication complexity and its motivation, comparison of problems
Lecture 13 Assisted communication complexity - randomness, entanglement
Lecture 14 Efficient randomness-assisted distributed computation over noisy channels (X-mas lecture)
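As a concrete companion to Lecture 8, here is a minimal Python sketch of the von Neumann extractor; the bias 0.8 and the sample size are arbitrary illustrative choices, not values from the course:

import random

# Von Neumann's debiasing trick: read the input bits in pairs, output the
# first bit of each unequal pair ((0,1) -> 0, (1,0) -> 1), and discard the
# equal pairs. For an iid coin with bias p, Pr[(0,1)] = Pr[(1,0)] = p(1-p),
# so the surviving bits are exactly uniform, at the cost of a reduced rate.
def von_neumann_extract(bits):
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out

biased = [1 if random.random() < 0.8 else 0 for _ in range(100000)]
uniform = von_neumann_extract(biased)
print(len(uniform), sum(uniform) / len(uniform))  # ratio close to 0.5

Note that this trick assumes an iid source with a fixed (possibly unknown) bias; the seeded and multi-source extractors of Lectures 9 and 10 address the harder case of weak sources that need not be iid.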
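Similarly, for the historical congruential generators of Lecture 11, a minimal sketch of a linear congruential generator; the constants below are the well-known Numerical Recipes parameters, an assumed choice rather than one specified by the course:

# Linear congruential generator: x_{n+1} = (a * x_n + c) mod m.
# Fast and historically ubiquitous, but predictable: a few observed outputs
# determine the entire future stream, which is exactly the weakness that the
# adversarial model discussed in the lectures is meant to capture.
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    x = seed % m
    while True:
        x = (a * x + c) % m
        yield x

gen = lcg(seed=2015)
print([next(gen) for _ in range(5)])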
Further comments
Study materials
The course is taught annually.

  • Permalink: https://is.muni.cz/predmet/fi/podzim2015/IA170