Bias–Variance

Bias–variance tradeoff

The bias–variance tradeoff is a concept in statistics and machine learning which states that reducing the variance of a model's predictions across training sets typically increases its bias, and vice versa. This creates a dilemma when trying to minimize both sources of error, as they are often in conflict. The bias–variance decomposition analyzes this tradeoff by breaking the expected generalization error into three terms: the squared bias, the variance, and the irreducible (noise) error.
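The decomposition can be estimated empirically. The following minimal sketch (not part of the source; the sine target, noise level, and polynomial degrees are illustrative assumptions) fits polynomial models of different complexity on many resampled training sets and estimates the squared bias and variance of their predictions at fixed test points:

```python
# Minimal sketch of an empirical bias-variance decomposition.
# Assumptions (not from the source): a sin(2*pi*x) ground truth,
# Gaussian noise with sd 0.3, and polynomial regression via numpy.
import numpy as np

rng = np.random.default_rng(0)
true_f = lambda x: np.sin(2 * np.pi * x)   # assumed ground-truth function
noise_sd = 0.3                             # irreducible noise level
x_test = np.linspace(0, 1, 50)             # fixed evaluation points

def simulate(degree, n_train=30, n_repeats=200):
    """Estimate bias^2 and variance of degree-`degree` polynomial fits."""
    preds = np.empty((n_repeats, x_test.size))
    for i in range(n_repeats):
        x = rng.uniform(0, 1, n_train)
        y = true_f(x) + rng.normal(0, noise_sd, n_train)
        coefs = np.polyfit(x, y, degree)       # fit on one resampled training set
        preds[i] = np.polyval(coefs, x_test)   # predict at the fixed test inputs
    mean_pred = preds.mean(axis=0)
    bias_sq = np.mean((mean_pred - true_f(x_test)) ** 2)  # squared bias term
    variance = np.mean(preds.var(axis=0))                  # variance term
    return bias_sq, variance

for degree in (1, 3, 9):
    b2, var = simulate(degree)
    print(f"degree {degree}: bias^2={b2:.3f}  variance={var:.3f}  "
          f"expected error ~ {b2 + var + noise_sd**2:.3f}")
```

Under these assumptions, low-degree fits show high bias and low variance, while high-degree fits show the reverse, with total expected error approximated by bias² + variance + noise variance.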

1 course covers this concept

CS 229: Machine Learning

Stanford University

Winter 2023

This comprehensive course covers machine learning principles across supervised, unsupervised, and reinforcement learning. Topics also include neural networks, support vector machines, the bias–variance tradeoff, and many real-world applications. It requires a background in computer science, probability, multivariable calculus, and linear algebra.

