Recurrent Neural Networks

Timestamp    Description
00:00:00     Recurrent Neural Networks
00:00:23     The problem with bag-of-words techniques
00:02:28     Using recurrence to process text as a sequence
00:07:53     Backpropagation with RNNs
00:12:03     RNNs vs. other sequence processing techniques
00:13:08     Introducing Language Models
00:14:37     Training RNN-based language models
00:17:40     Text generation with RNN-based language models
00:19:44     Evaluating language models with Perplexity
00:20:54     The shortcomings of simple RNNs
00:22:48     Capturing long-range dependencies with LSTMs
00:27:20     Multilayer and bidirectional RNNs
00:29:58     DEMO: Building a Part-of-Speech Tagger with a bidirectional LSTM
00:42:22     DEMO: Building a language model with a stacked LSTM
00:58:04     Different RNN setups

References and Links