The method of moments is a statistical estimation technique that estimates population parameters by equating the population moments (expected values of powers of a random variable) with the corresponding sample moments and solving for the parameters of interest. It was introduced by Pafnuty Chebyshev in 1887 and was later popularized by Karl Pearson.
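As a concrete illustration (not part of the original page), here is a minimal Python sketch assuming a gamma(k, θ) model: matching the first two sample moments to E[X] = kθ and Var(X) = kθ² and solving gives θ̂ = s²/x̄ and k̂ = x̄²/s².

```python
import numpy as np

def gamma_method_of_moments(samples):
    """Method-of-moments estimates for a gamma(shape k, scale theta) model.

    Population moments: E[X] = k * theta, Var(X) = k * theta**2.
    Equating these to the sample mean and variance and solving yields
    theta_hat = var / mean and k_hat = mean**2 / var.
    """
    mean = np.mean(samples)
    var = np.var(samples)  # population (ddof=0) variance, matching the moment definition
    theta_hat = var / mean
    k_hat = mean**2 / var
    return k_hat, theta_hat

# Hypothetical usage: recover known parameters from simulated data
rng = np.random.default_rng(0)
data = rng.gamma(shape=3.0, scale=2.0, size=10_000)
print(gamma_method_of_moments(data))  # approximately (3.0, 2.0)
```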
University of Washington
Winter 2022
This course dives deep into the role of probability in computer science, exploring applications in algorithms, systems, data analysis, machine learning, and more. Prerequisites include CSE 311 and MATH 126, along with a grasp of calculus, linear algebra, set theory, and basic proof techniques. Topics covered range from discrete probability to hypothesis testing and bootstrapping.