Error Decomposition

Bias–variance tradeoff

The bias–variance tradeoff is a concept in statistics and machine learning describing the tension between two sources of prediction error: models flexible enough to achieve low bias tend to exhibit high variance across training samples, while constraining a model to reduce variance tends to increase its bias. This creates a dilemma when trying to minimize both sources of error at once, as they are often in conflict. The bias–variance decomposition analyzes this tradeoff by breaking the expected generalization error into three terms: squared bias, variance, and irreducible error.
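For squared-error loss, the three terms have a standard closed form. The following is the usual statement of the decomposition (the passage above does not specify a loss, so squared error is assumed here); the expectation is taken over both the observation noise and the random draw of the training set used to fit the predictor:

```latex
% Bias–variance decomposition of expected squared error at a point x,
% for data y = f(x) + \varepsilon with \mathbb{E}[\varepsilon] = 0 and
% \operatorname{Var}(\varepsilon) = \sigma^2, and a learned predictor \hat{f}.
\mathbb{E}\!\left[\big(y - \hat{f}(x)\big)^2\right]
  = \underbrace{\big(\mathbb{E}[\hat{f}(x)] - f(x)\big)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}\!\left[\big(\hat{f}(x) - \mathbb{E}[\hat{f}(x)]\big)^2\right]}_{\text{variance}}
  + \underbrace{\sigma^2}_{\text{irreducible error}}
```

The first two terms depend on the choice of model and can be traded against each other (e.g., by varying model complexity or regularization strength), while the irreducible noise term σ² is a floor that no model can improve on.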

1 course covers this concept

COS 324 - Introduction to Machine Learning

Princeton University

Fall 2017

A thorough introduction to machine learning principles such as online learning, decision making, gradient-based learning, and empirical risk minimization. It also explores regression, classification, dimensionality reduction, ensemble methods, neural networks, and deep learning. The course material is self-contained and based on freely available resources.

