Chebyshev's inequality

Chebyshev's inequality is a theorem of probability theory stating that no more than 1/k² of a distribution's values can lie k or more standard deviations from the mean. It applies to any probability distribution with a well-defined mean and variance, and plays a role analogous to the 68–95–99.7 rule for normal distributions. It is closely related to Markov's inequality, which is sometimes referred to as "Chebyshev's First Inequality".
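
Stated formally: if a random variable X has finite mean μ and standard deviation σ, then P(|X − μ| ≥ kσ) ≤ 1/k² for every k > 0. As a quick illustration, the bound can be checked empirically; the sketch below is minimal and not drawn from any of the courses listed here (the exponential distribution is an arbitrary test case):

```python
import random

# Chebyshev: for any distribution with finite mean mu and standard
# deviation sigma, P(|X - mu| >= k*sigma) <= 1/k**2 for all k > 0.
# Empirical check on an exponential distribution (mu = sigma = 1).
random.seed(0)
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]

mu = sum(samples) / n
sigma = (sum((x - mu) ** 2 for x in samples) / n) ** 0.5

for k in (2, 3, 4):
    # Fraction of samples at least k standard deviations from the mean.
    frac = sum(abs(x - mu) >= k * sigma for x in samples) / n
    print(f"k={k}: observed tail {frac:.4f} <= Chebyshev bound {1/k**2:.4f}")
```

The exponential distribution is noticeably skewed, yet the observed tail fractions stay well under 1/k², as the inequality guarantees for any distribution with finite variance.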

3 courses cover this concept

CSE 312 Foundations of Computing II

University of Washington

Winter 2022

This course dives deep into the role of probability in computer science, exploring applications in algorithms, systems, data analysis, machine learning, and more. Prerequisites include CSE 311, MATH 126, and a grasp of calculus, linear algebra, set theory, and basic proof techniques. Concepts covered range from discrete probability to hypothesis testing and bootstrapping.

CS 168: The Modern Algorithmic Toolbox

Stanford University

Spring 2022

CS 168 provides a comprehensive introduction to modern algorithm concepts, covering hashing, dimension reduction, programming, gradient descent, and regression. It emphasizes both theoretical understanding and practical application, with each topic complemented by a mini-project. It's suitable for those who have taken CS107 and CS161.

CS 265 / CME 309 Randomized Algorithms and Probabilistic Analysis

Stanford University

Fall 2022

This course dives into the use of randomness in algorithms and data structures, emphasizing the theoretical foundations of probabilistic analysis. Topics range from tail bounds and Markov chains to randomized algorithms, and the concepts are applied to machine learning, networking, and systems. The prerequisites assume an intermediate-level background.
