Each course is spread out over several weeks and is made up of video lectures, lab sessions, quizzes, assignments, and related course materials, code, and data.
- Preprocessing
- Building and Visualizing word frequencies
- Visualizing tweets and the Logistic Regression model
- NLP Course 2 Week 1 Lesson: Building The Model - Lecture Exercise 01
- NLP Course 2 Week 1 Lesson: Building The Model - Lecture Exercise 02
- Parts-of-Speech Tagging - First Steps: Working with text files, Creating a Vocabulary and Handling Unknown Words
- Parts-of-Speech Tagging - Working with tags and Numpy
- Word Embeddings First Steps: Data Preparation
- Word Embeddings: Intro to CBOW model, activation functions and working with Numpy
- Word Embeddings: Training the CBOW model
- Word Embeddings: Hands On
- Word Embeddings: Ungraded Practice Notebook
- Hidden State Activation: Ungraded Lecture Notebook
- Vanilla RNNs, GRUs and the `scan` function
- Working with JAX numpy and calculating perplexity: Ungraded Lecture Notebook
- Creating a GRU model using Trax: Ungraded Lecture Notebook
- Creating a Siamese model using Trax: Ungraded Lecture Notebook
- Modified Triplet Loss: Ungraded Lecture Notebook
- Evaluate a Siamese model: Ungraded Lecture Notebook
- Basic Attention Operation: Ungraded Lab
- Scaled Dot-Product Attention: Ungraded Lab
- Calculating the Bilingual Evaluation Understudy (BLEU) score: Ungraded Lab
- Stack Semantics in Trax: Ungraded Lab
- The Three Ways of Attention and Dot Product Attention: Ungraded Lab Notebook
- The Transformer Decoder: Ungraded Lab Notebook
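The early labs above build word-frequency counts as features for logistic regression sentiment analysis. As a rough illustration of the idea (not the course's actual code; the function and variable names here are hypothetical), a minimal sketch counts how often each word appears with each sentiment label:

```python
from collections import defaultdict

def build_freqs(tweets, labels):
    """Map each (word, sentiment) pair to its corpus count.

    `tweets` is a list of token lists and `labels` a parallel list of
    0/1 sentiment labels (illustrative names, not the course API).
    """
    freqs = defaultdict(int)
    for tokens, label in zip(tweets, labels):
        for word in tokens:
            freqs[(word, label)] += 1
    return dict(freqs)

tweets = [["great", "movie"], ["bad", "movie"]]
labels = [1, 0]
freqs = build_freqs(tweets, labels)
# "movie" appears once with each label; "great" only with the positive one
```

These counts are what the visualization labs plot, and summing the positive and negative counts of a tweet's words yields the two features fed to logistic regression.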
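The word-embedding labs implement the CBOW model with NumPy. A minimal sketch of a single forward pass, with toy dimensions and randomly initialized weights (an illustration of the architecture, not the notebooks' exact code):

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def softmax(z):
    e = np.exp(z - np.max(z))
    return e / e.sum()

# Toy dimensions: vocabulary of 5 words, embeddings of size 3
V, N = 5, 3
rng = np.random.default_rng(0)
W1 = rng.normal(size=(N, V))  # input-side weights (columns act as embeddings)
b1 = rng.normal(size=(N, 1))
W2 = rng.normal(size=(V, N))  # output-side weights
b2 = rng.normal(size=(V, 1))

# Input: the average of the one-hot context vectors
# (two context words here, at vocabulary indices 1 and 3)
x = np.zeros((V, 1))
x[[1, 3]] = 0.5

h = relu(W1 @ x + b1)         # hidden layer
y_hat = softmax(W2 @ h + b2)  # predicted distribution over center words
# y_hat sums to 1 over the vocabulary
```

Training then backpropagates the cross-entropy loss between `y_hat` and the one-hot center word, which is what the "Training the CBOW model" lab works through.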
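The attention labs center on the formula softmax(QKᵀ/√d_k)·V. A minimal single-head NumPy sketch of that operation, without masking or batching (an assumption-laden illustration, not the labs' Trax implementation):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D Q, K, V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # similarity of queries to keys
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V, weights

# Two queries/keys of dimension 2; values chosen to make attention visible
Q = np.array([[1.0, 0.0], [0.0, 1.0]])
K = np.array([[1.0, 0.0], [0.0, 1.0]])
V = np.array([[10.0, 0.0], [0.0, 10.0]])
out, w = scaled_dot_product_attention(Q, K, V)
# each row of w sums to 1; query 0 attends mostly to key 0
```

The "Three Ways of Attention" lab extends this same operation to encoder-decoder, causal, and bidirectional self-attention by changing what plays the roles of Q, K, and V and by adding masks to the score matrix.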
The solutions presented are intended to serve as a reference for other learners enrolled in this course.