| Timestamp | Topic |
| --- | --- |
| 00:00:00 | Recurrent Neural Networks |
| 00:00:23 | The problem with bag-of-words techniques |
| 00:02:28 | Using recurrence to process text as a sequence |
| 00:07:53 | Backpropagation with RNNs |
| 00:12:03 | RNNs vs. other sequence processing techniques |
| 00:13:08 | Introducing Language Models |
| 00:14:37 | Training RNN-based language models |
| 00:17:40 | Text generation with RNN-based language models |
| 00:19:44 | Evaluating language models with perplexity |
| 00:20:54 | The shortcomings of simple RNNs |
| 00:22:48 | Capturing long-range dependencies with LSTMs |
| 00:27:20 | Multilayer and bidirectional RNNs |
| 00:29:58 | DEMO: Building a part-of-speech tagger with a bidirectional LSTM |
| 00:42:22 | DEMO: Building a language model with a stacked LSTM |
| 00:58:04 | Different RNN setups |