Entropy is a measurable physical property most commonly associated with disorder, randomness, or uncertainty. It is closely connected to probability distributions, and the principle of maximum entropy is a powerful tool in statistical inference, particularly in Bayesian statistics. In this course we introduce the concept of entropy in the context of information theory and apply these concepts to real data sets.
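As a small illustration of the kind of computation covered in the course, the Shannon entropy of a discrete distribution can be sketched in a few lines of Python (the function name is illustrative and not part of the course materials):

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H(X) = -sum_x p(x) log p(x), in bits by default.

    Zero-probability outcomes contribute nothing, following the
    convention 0 * log 0 = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally uncertain among two-outcome distributions:
# its entropy is 1 bit.
print(shannon_entropy([0.5, 0.5]))

# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))
```

The fair coin attains the maximum entropy over all distributions on two outcomes, which is the simplest instance of the maximum-entropy principle mentioned above.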
Syllabus
- Introduction and Preview
- Entropy, Relative Entropy, and Mutual Information
- Asymptotic Equipartition Property
- Entropy Rates of a Stochastic Process
- Differential Entropy
- Information Theory and Statistics
Target audience:
- Advanced Bachelor Statistics (e.g. "Ausgewählte Gebiete der angewandten Statistik B", 3 ECTS)
- Master Statistics, Biostatistics, Statistics (WiSO), (e.g. "Ausgewählte Gebiete der ... Statistik B", 3 ECTS)
- ESG Data Science (3 ECTS Elective Course)
Date and Time: April 16th to May 28th (!), Fridays 2pm to 6pm
Exam: Midterm and Final (Written Exams)
The course will be held online in English.
Code for course enrollment: Information
- Instructor: Zahra Aminifarsani
- Instructor: Volker Schmid