Empirical risk minimization

Empirical risk minimization (ERM) is a principle in statistical learning theory that defines a family of learning algorithms. Because the true data distribution is unknown, the true risk of a hypothesis cannot be computed directly; ERM instead measures performance on a known set of training data, and this empirical risk (the average loss over the training examples) serves as an estimate of the true risk. An ERM algorithm selects the hypothesis that minimizes the empirical risk.
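A minimal sketch of the principle in Python, assuming squared loss and a toy hypothesis class of linear predictors h_w(x) = w * x; the data, loss, and candidate set here are illustrative assumptions, not taken from either course below. The empirical risk of a hypothesis h over n examples is (1/n) * sum of L(h(x_i), y_i), and ERM returns the candidate that minimizes it:

import numpy as np

def empirical_risk(h, X, y, loss=lambda y_hat, y_true: (y_hat - y_true) ** 2):
    # Average loss of hypothesis h over the training set: an estimate of
    # the true risk, which would require the (unknown) data distribution.
    return np.mean([loss(h(x), yi) for x, yi in zip(X, y)])

# Toy training data; in practice the generating distribution is unknown.
rng = np.random.default_rng(0)
X = rng.normal(size=20)
y = 2.0 * X + rng.normal(scale=0.1, size=20)

# A small hypothesis class of linear predictors h_w(x) = w * x.
candidates = [lambda x, w=w: w * x for w in np.linspace(-3.0, 3.0, 61)]

# ERM: pick the hypothesis with the lowest empirical risk.
h_hat = min(candidates, key=lambda h: empirical_risk(h, X, y))
print(empirical_risk(h_hat, X, y))  # small, since the grid contains the true slope w = 2.0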

2 courses cover this concept

CS 168: The Modern Algorithmic Toolbox

Stanford University

Spring 2022

CS 168 provides a comprehensive introduction to modern algorithmic concepts, covering hashing, dimensionality reduction, linear programming, gradient descent, and regression. It emphasizes both theoretical understanding and practical application, with each topic complemented by a mini-project. It is suitable for students who have taken CS107 and CS161.

11-785 Introduction to Deep Learning

Carnegie Mellon University

Spring 2020

This course provides a comprehensive introduction to deep learning, progressing from foundational concepts to complex topics such as sequence-to-sequence models. Students gain hands-on experience with PyTorch and fine-tune models through practical assignments. A basic understanding of calculus, linear algebra, and Python programming is required.
