[Day 57] Stanford CS224N - Lecture 1. Word vectors
Hello :) Today is Day 57!

Quick summary of today:
- Got introduced to word vectors with lecture 1 + a small showcase
- Read the papers and took notes on: Efficient Estimation of Word Representations in Vector Space (word2vec) and Distributed Representations of Words and Phrases and their Compositionality (negative sampling)

First I watched the lecture by Professor Manning, and then I read the papers, so some of the material overlapped, but there were still interesting new parts in each of the three.

1) Lecture notes

My notes from the lecture were not that long, so I will share them first. After the lecture, there was a small Colab notebook I could run through. In the lecture, we saw that the larger the dot product between two word vectors, the more similar the words are -> exp(u_o^T v_c). The Colab had a model loaded, and I searched for the word embeddings of "bread" and "baguette", which in my view are similar words. In the below pi...
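To make the dot-product idea concrete, here is a small sketch of the word2vec softmax, P(o | c) = exp(u_o^T v_c) / sum_w exp(u_w^T v_c), plus the cosine similarity used when comparing embeddings like "bread" and "baguette". The vocabulary and vectors below are random toy placeholders I made up, not the trained embeddings from the Colab.

```python
import numpy as np

# Toy setup: random "outside" vectors u_w and one "center" vector v_c.
# These are placeholders, not real trained word2vec embeddings.
rng = np.random.default_rng(0)
vocab = ["bread", "baguette", "car", "tree"]
dim = 4
U = rng.normal(size=(len(vocab), dim))  # one row u_w per vocabulary word
v_c = rng.normal(size=dim)              # center word vector v_c

# word2vec softmax: bigger dot product u_o . v_c -> higher probability
scores = U @ v_c
probs = np.exp(scores) / np.exp(scores).sum()
print({w: round(float(p), 3) for w, p in zip(vocab, probs)})
print(probs.sum())  # the probabilities sum to 1

# Cosine similarity, the usual way to compare two word embeddings
def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(U[0], U[1]))  # similarity between the toy "bread" and "baguette" rows
```

With real pretrained vectors, similar words such as "bread" and "baguette" would get a noticeably higher cosine similarity than unrelated pairs; with these random toy vectors the number is meaningless and just shows the computation.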