Gated recurrent units (GRUs) are a gating mechanism in recurrent neural networks, introduced by Kyunghyun Cho et al. in 2014. They are similar to long short-term memory (LSTM) units but have fewer parameters, since they use only an update gate and a reset gate and lack an output gate. GRUs have shown performance comparable to LSTMs on tasks such as music modeling, speech signal modeling, and natural language processing, with no definitive conclusion on which is superior.
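To make the two gates concrete, below is a minimal single-step GRU cell in NumPy. This is an illustrative sketch, not a reference implementation: the parameter names (Wz, Uz, bz, and so on) and the dimensions are assumptions, and the state update follows the common convention h_t = (1 - z) * h_prev + z * h_tilde, which some references write with z and (1 - z) swapped.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, params):
    """One GRU time step: x has shape (input_dim,), h_prev has shape (hidden_dim,)."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h_prev + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev) + bh)   # candidate state
    return (1.0 - z) * h_prev + z * h_tilde              # interpolate old and new state

# Illustrative shapes: 8-dim input, 16-dim hidden state (arbitrary choices).
rng = np.random.default_rng(0)
input_dim, hidden_dim = 8, 16
params = (
    rng.standard_normal((hidden_dim, input_dim)),   # Wz
    rng.standard_normal((hidden_dim, hidden_dim)),  # Uz
    np.zeros(hidden_dim),                           # bz
    rng.standard_normal((hidden_dim, input_dim)),   # Wr
    rng.standard_normal((hidden_dim, hidden_dim)),  # Ur
    np.zeros(hidden_dim),                           # br
    rng.standard_normal((hidden_dim, input_dim)),   # Wh
    rng.standard_normal((hidden_dim, hidden_dim)),  # Uh
    np.zeros(hidden_dim),                           # bh
)

h = np.zeros(hidden_dim)
for t in range(5):  # run a short random sequence through the cell
    h = gru_step(rng.standard_normal(input_dim), h, params)
print(h.shape)  # (16,)
```

Note that this cell has two gates and a single state vector, versus an LSTM's three gates and separate cell and hidden states, which is where the parameter savings mentioned above come from.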
Stanford University
Spring 2022
This is a deep dive into the details of deep learning architectures for visual recognition tasks. The course gives students the ability to implement and train their own neural networks and to understand state-of-the-art computer vision research. It requires Python proficiency and familiarity with calculus, linear algebra, probability, and statistics.
Brown University
Spring 2022
Brown University's Deep Learning course acquaints students with the transformative capabilities of deep neural networks in computer vision, NLP, and reinforcement learning. Using the TensorFlow framework, students address topics such as CNNs, RNNs, deepfakes, and reinforcement learning, with an emphasis on ethical applications and potential societal impacts.