Whether you’re a student, a researcher, or a practitioner, I hope that my detailed, in-depth explanation will give you the real understanding and knowledge that you’re looking for.
Python Notebooks (hosted on Google Colab) implement key portions of the algorithm from scratch to further illustrate the concepts.
In the Chapter 1 Notebook, we'll play around with a pre-trained word2vec model to look at its vocabulary and to try out some of the basic operations commonly performed on word vectors.
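One of the most common of those basic operations is comparing two words by the cosine similarity of their vectors. Here's a rough sketch in plain NumPy; the three-dimensional vectors are made up for illustration, whereas a real pre-trained model would supply vectors with hundreds of dimensions:

```python
import numpy as np

# Toy "word vectors" (illustrative only -- not from a real model).
vectors = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.2]),
    "apple": np.array([0.1, 0.9, 0.5]),
}

def cosine_similarity(a, b):
    # Cosine similarity: dot product of the two vectors, divided by
    # the product of their lengths.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

sim_royal = cosine_similarity(vectors["king"], vectors["queen"])
sim_fruit = cosine_similarity(vectors["king"], vectors["apple"])
print(sim_royal > sim_fruit)  # related words score higher on this toy data
```

Libraries like gensim wrap this same operation (and neighbor lookups built on it) behind convenience methods, which is what the notebook explores.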
In the Chapter 2 Notebook, we'll reinforce our understanding of the skip-gram neural network architecture by implementing a forward pass from scratch.
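The core of that forward pass fits in a few lines of NumPy. This is a sketch with tiny, random weight matrices, not the notebook's actual code: the one-hot input reduces the first matrix multiply to a row lookup, and a softmax over the output scores gives a probability for every vocabulary word:

```python
import numpy as np

# Minimal skip-gram forward pass (illustrative sizes, random weights).
vocab_size, embed_dim = 5, 3
rng = np.random.default_rng(0)
W_in  = rng.standard_normal((vocab_size, embed_dim))  # input / embedding matrix
W_out = rng.standard_normal((embed_dim, vocab_size))  # output / context matrix

center_word_id = 2
h = W_in[center_word_id]    # hidden layer: multiplying by a one-hot
                            # vector just selects this row
scores = h @ W_out          # one raw score per vocabulary word
probs = np.exp(scores) / np.exp(scores).sum()  # softmax over the vocabulary
```

The entry of `probs` for a word is the model's predicted probability of finding that word in the context of the center word.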
In the Chapter 5 Notebook, we'll get hands-on with backpropagation and implement the weight updates (for a skip-gram model with negative sampling) from scratch!
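To give a flavor of what those updates look like, here is a sketch of a single SGD step for one (center word, context word) pair plus two negative samples. The vectors are toy values, and each gradient has the form `(sigmoid(score) - label) * other_vector`, with label 1 for the true context word and 0 for the negatives:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def loss(v_c, v_p, v_n):
    # Negative-sampling objective for one training pair:
    # push the positive score up, push each negative score down.
    return -np.log(sigmoid(v_p @ v_c)) - np.sum(np.log(sigmoid(-(v_n @ v_c))))

lr = 0.1
v_c = np.array([0.1, 0.2, 0.3])                       # center word (input vector)
v_p = np.array([0.2, 0.1, 0.0])                       # true context word (output vector)
v_n = np.array([[0.3, -0.1, 0.2], [-0.2, 0.4, 0.1]])  # two negative samples

before = loss(v_c, v_p, v_n)

g_pos = sigmoid(v_p @ v_c) - 1.0   # label is 1 for the positive pair
g_neg = sigmoid(v_n @ v_c)         # labels are 0 for the negatives
grad_c = g_pos * v_p + g_neg @ v_n

v_p = v_p - lr * g_pos * v_c
v_n = v_n - lr * g_neg[:, None] * v_c
v_c = v_c - lr * grad_c

after = loss(v_c, v_p, v_n)
print(after < before)  # the step reduces the loss
```

Note that only the vectors for the sampled words get touched, which is exactly why negative sampling makes training so much cheaper than a full softmax update.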
In the Chapter 6 Notebook, we will train a word2vec model with subword information (using the ‘fasttext’ model in gensim) on the Wikipedia Attack Comments dataset. We'll look at how the training time and memory requirements compare, as well as the quality of the resulting vectors.
Appendix - Complete word2vec Training Example.ipynb
This Notebook goes through the full process of training a word2vec model using the gensim library. You can use this as a starting point for training your own model on your own dataset.
The Example Code is sold separately from the eBook; check the box to add it to your order during checkout.