Word Vectors

Timestamp   Description
00:00:00    Word Vectors
00:00:37    One-Hot Encoding and its shortcomings
00:02:07    What embeddings are and why they're useful
00:05:12    Similar words share similar contexts
00:06:15    Word2Vec, a way to automatically create word embeddings
00:08:08    Skip-Gram With Negative Sampling (SGNS)
00:17:11    Three ways to use word vectors in models
00:18:48    DEMO: Training and using word vectors
00:41:29    The weaknesses of static word embeddings
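The Skip-Gram With Negative Sampling segment above can be sketched in a few lines of NumPy. This is a minimal illustration, not the lecture's demo code: the toy corpus, window size, negative-sample count, and learning rate are all assumptions chosen to keep the example small. For each (center, context) pair it raises the dot product of the two word vectors, while pushing it down for k randomly drawn "negative" words.

```python
import numpy as np

# Toy corpus; real SGNS training uses millions of tokens (assumed example data).
corpus = "the quick brown fox jumps over the lazy dog the quick fox".split()
vocab = sorted(set(corpus))
w2i = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8            # vocabulary size, embedding dimension

rng = np.random.default_rng(0)
W_in = rng.normal(0, 0.1, (V, D))    # center-word vectors (the embeddings we keep)
W_out = rng.normal(0, 0.1, (V, D))   # context-word vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

window, k, lr = 2, 3, 0.05      # context window, negatives per positive, learning rate
for epoch in range(100):
    for pos, word in enumerate(corpus):
        c = w2i[word]
        lo, hi = max(0, pos - window), min(len(corpus), pos + window + 1)
        for ctx_pos in range(lo, hi):
            if ctx_pos == pos:
                continue
            o = w2i[corpus[ctx_pos]]
            # One observed (positive) pair plus k random negative samples.
            samples = [(o, 1.0)] + [(int(rng.integers(V)), 0.0) for _ in range(k)]
            for j, label in samples:
                score = sigmoid(W_in[c] @ W_out[j])
                grad = score - label          # gradient of the logistic loss
                g_in = grad * W_out[j]
                W_out[j] -= lr * grad * W_in[c]
                W_in[c] -= lr * g_in

# After training, rows of W_in are the static word embeddings.
embedding = {w: W_in[i] for w, i in w2i.items()}
```

Note that each word gets exactly one vector regardless of context, which is the weakness of static embeddings flagged in the final chapter.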

References and Links