Refer to this lesson plan for the objectives and schedule of the class:
Learners will understand:
- Deep Learning for NLP
- Common NLP tasks
- Recurrent Neural Networks (RNN)
- Attention Mechanism
- Transformers
Learners will be able to:
- Train an RNN for language modeling
- Fine-tune a pre-trained BERT model for sentiment analysis
- Use a pre-trained BERT model for NER
- Use a pre-trained GPT-2 model for text generation
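As a quick illustration of the first learning outcome, a character-level RNN language model can be trained in a few dozen lines of NumPy. This is a sketch for orientation only, not the class notebook; the corpus, hidden size, and hyperparameters are arbitrary choices.

```python
import numpy as np

np.random.seed(0)
text = "hello world hello world "
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
V, H = len(chars), 16  # vocabulary size, hidden size

# Parameters: input-to-hidden, hidden-to-hidden, hidden-to-output
Wxh = np.random.randn(H, V) * 0.01
Whh = np.random.randn(H, H) * 0.01
Why = np.random.randn(V, H) * 0.01
bh, by = np.zeros(H), np.zeros(V)

def loss_and_grads(inputs, targets):
    """Forward pass over the sequence, then backprop through time."""
    xs, hs, ps = {}, {-1: np.zeros(H)}, {}
    loss = 0.0
    for t, ix in enumerate(inputs):
        x = np.zeros(V); x[ix] = 1.0            # one-hot input character
        xs[t] = x
        hs[t] = np.tanh(Wxh @ x + Whh @ hs[t - 1] + bh)
        y = Why @ hs[t] + by
        p = np.exp(y - y.max()); p /= p.sum()   # softmax over next character
        ps[t] = p
        loss -= np.log(p[targets[t]])           # cross-entropy loss
    dWxh, dWhh, dWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
    dbh, dby, dhnext = np.zeros_like(bh), np.zeros_like(by), np.zeros(H)
    for t in reversed(range(len(inputs))):
        dy = ps[t].copy(); dy[targets[t]] -= 1.0
        dWhy += np.outer(dy, hs[t]); dby += dy
        dh = Why.T @ dy + dhnext
        draw = (1 - hs[t] ** 2) * dh            # backprop through tanh
        dWxh += np.outer(draw, xs[t]); dWhh += np.outer(draw, hs[t - 1]); dbh += draw
        dhnext = Whh.T @ draw
    return loss, (dWxh, dWhh, dWhy, dbh, dby)

# Train to predict each next character of the tiny corpus
inputs = [stoi[c] for c in text[:-1]]
targets = [stoi[c] for c in text[1:]]
losses = []
for epoch in range(200):
    loss, grads = loss_and_grads(inputs, targets)
    losses.append(loss)
    for param, grad in zip([Wxh, Whh, Why, bh, by], grads):
        param -= 0.1 * np.clip(grad, -5, 5)     # SGD with gradient clipping

print(f"loss: {losses[0]:.1f} -> {losses[-1]:.1f}")
```

In class, the same idea is expressed with a framework's built-in RNN layers rather than hand-written backpropagation.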
| Duration | What | How or Why |
|---|---|---|
| 5 mins | Start Zoom session | So that learners can join early and class starts on time. |
| 20 mins | Activity | Recap on self-study and prework materials. |
| 40 mins | Code-along | Part 1: Deep Learning for NLP and common NLP tasks. |
| 1 HR MARK | ||
| 30 mins | Code-along | Part 2: RNNs (LSTM and GRU). |
| 10 mins | Break | |
| 20 mins | Code-along | Part 3: Attention mechanism. |
| 2 HR MARK | ||
| 50 mins | Code-along | Part 4: Transformers (BERT and GPT). |
| 10 mins | Briefing / Q&A | Brief on references, assignment, quiz and Q&A. |
| 3 HR MARK | End class | |
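The scaled dot-product attention covered in Part 3 can be sketched in a few lines of NumPy. This is illustrative only; the shapes and random inputs are arbitrary.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V, weights

np.random.seed(0)
Q = np.random.randn(3, 4)   # 3 query positions, d_k = 4
K = np.random.randn(5, 4)   # 5 key positions
V = np.random.randn(5, 4)   # one value vector per key
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape, w.shape)   # (3, 4) (3, 5)
```

Each output row is a weighted average of the value vectors, with weights given by how well the query matches each key; this is the building block the Transformer session builds on.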