Kullback–Leibler divergence, also known as relative entropy, is a measure of how one probability distribution differs from a second, reference distribution. It is nonnegative and equals zero exactly when the two distributions coincide, and it is used to quantify information gain in information systems and randomness in continuous time-series. It has diverse applications in theoretical and practical fields such as applied statistics, fluid mechanics, neuroscience, and bioinformatics.
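For a discrete sample space the divergence is D_KL(P ∥ Q) = Σᵢ pᵢ log(pᵢ / qᵢ). Below is a minimal NumPy sketch of that formula (illustrative only; the function name and example distributions are hypothetical, assuming P and Q are given as probability vectors over the same outcomes):

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_i p_i * log(p_i / q_i) for discrete distributions.

    Nonnegative, and zero exactly when P and Q are identical.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # convention: 0 * log(0 / q) = 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical example distributions over three outcomes
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # small positive value
print(kl_divergence(p, p))  # 0.0
```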
Stanford University
Autumn 2022
The course addresses both classic and recent developments in counting and sampling. It covers counting complexity, exact counting via determinants, sampling via Markov chains, and high-dimensional expanders.
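As one concrete illustration of "exact counting via determinants", Kirchhoff's matrix-tree theorem states that the number of spanning trees of a graph equals any cofactor of its Laplacian. The sketch below (illustrative Python, not course material) applies it to the complete graph K4:

```python
import numpy as np

def count_spanning_trees(adj):
    """Count spanning trees via the matrix-tree theorem.

    The count equals the determinant of the Laplacian L = D - A
    with any one row and the matching column removed.
    """
    adj = np.asarray(adj, dtype=float)
    laplacian = np.diag(adj.sum(axis=1)) - adj
    minor = laplacian[1:, 1:]  # delete row 0 and column 0
    return int(round(np.linalg.det(minor)))

# Example: K4 has 4^(4-2) = 16 spanning trees by Cayley's formula.
k4 = np.ones((4, 4)) - np.eye(4)
print(count_spanning_trees(k4))  # 16
```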