Gated recurrent unit (GRU)

Gated recurrent units (GRUs) are a gating mechanism for recurrent neural networks, introduced in 2014. They are similar to long short-term memory (LSTM) units but have fewer parameters, as they lack an output gate. GRUs have shown performance comparable to LSTMs on tasks such as music modeling, speech signal modeling, and natural language processing, with no definitive conclusion on which is superior.
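To make the gating concrete, below is a minimal NumPy sketch of a single GRU step using the standard update/reset/candidate equations; the parameter names, sizes, and random initialization are illustrative assumptions, not taken from the source or either course.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def gru_step(x_t, h_prev, p):
        # Update gate z: how strongly the new candidate replaces the old state.
        z = sigmoid(p["W_z"] @ x_t + p["U_z"] @ h_prev + p["b_z"])
        # Reset gate r: how much of the old state feeds the candidate.
        r = sigmoid(p["W_r"] @ x_t + p["U_r"] @ h_prev + p["b_r"])
        # Candidate state, with the past gated elementwise by r.
        h_tilde = np.tanh(p["W_h"] @ x_t + p["U_h"] @ (r * h_prev) + p["b_h"])
        # Blend old state and candidate; texts differ on which term z scales.
        return (1.0 - z) * h_prev + z * h_tilde

    # Toy usage with illustrative sizes and random (untrained) weights.
    rng = np.random.default_rng(0)
    input_dim, hidden_dim = 4, 3
    p = {}
    for gate in ("z", "r", "h"):
        p["W_" + gate] = 0.1 * rng.standard_normal((hidden_dim, input_dim))
        p["U_" + gate] = 0.1 * rng.standard_normal((hidden_dim, hidden_dim))
        p["b_" + gate] = np.zeros(hidden_dim)
    h = np.zeros(hidden_dim)
    for x_t in rng.standard_normal((5, input_dim)):  # a sequence of 5 steps
        h = gru_step(x_t, h, p)

Counting the blocks above shows where the parameter savings come from: a GRU carries three (W, U, b) sets (update, reset, candidate), whereas an LSTM carries four, since the GRU has no separate output gate.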

2 courses cover this concept

CS231n: Deep Learning for Computer Vision

Stanford University

Spring 2022

This course is a deep dive into the details of deep learning architectures for visual recognition tasks. It equips students to implement and train their own neural networks and to understand state-of-the-art computer vision research. It requires Python proficiency and familiarity with calculus, linear algebra, probability, and statistics.

CSCI 1470/2470 Deep Learning

Brown University

Spring 2022

Brown University's Deep Learning course acquaints students with the transformative capabilities of deep neural networks in computer vision, NLP, and reinforcement learning. Using the TensorFlow framework, the course addresses topics such as CNNs, RNNs, deepfakes, and reinforcement learning, with an emphasis on ethical applications and potential societal impacts.
