# 3.10 Natural Language Processing Advanced

## Dependencies

Refer to the following markdown file for the respective sections of the class:

## Lesson Objectives

Learners will understand:

- Deep Learning for NLP
- Common NLP tasks
- Recurrent Neural Networks (RNN)
- Attention Mechanism (see the sketch after this list)
- Transformers
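
To make the attention mechanism concrete, here is a minimal sketch of scaled dot-product attention written with plain PyTorch tensor operations. PyTorch and the function name `scaled_dot_product_attention` are our assumptions for illustration and are not taken from the lesson materials.

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(query, key, value, mask=None):
    """Minimal scaled dot-product attention (illustrative sketch).

    query, key, value: tensors of shape (batch, seq_len, d_k).
    mask: optional boolean tensor broadcastable to (batch, seq_len, seq_len);
          positions where mask is False are excluded from attention.
    """
    d_k = query.size(-1)
    # Similarity score between every query and every key, scaled by sqrt(d_k)
    scores = torch.matmul(query, key.transpose(-2, -1)) / d_k**0.5
    if mask is not None:
        scores = scores.masked_fill(~mask, float("-inf"))
    # Attention weights sum to 1 over the key positions
    weights = F.softmax(scores, dim=-1)
    # Each output position is a weighted average of the value vectors
    return torch.matmul(weights, value), weights

# Toy usage: self-attention over a batch of 2 sequences of length 5, dim 8
x = torch.randn(2, 5, 8)
out, attn = scaled_dot_product_attention(x, x, x)
print(out.shape, attn.shape)  # torch.Size([2, 5, 8]) torch.Size([2, 5, 5])
```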

Learners will be able to:

- Train an RNN for language modeling (see the sketches after this list)
- Fine-tune a pre-trained BERT model for sentiment analysis
- Use a pre-trained BERT model for NER
- Use a pre-trained GPT-2 model for text generation
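
As a rough preview of the RNN language-modeling task, this is a minimal sketch of an LSTM next-token model, assuming PyTorch; the class name and hyperparameters are illustrative only.

```python
import torch.nn as nn

class LSTMLanguageModel(nn.Module):
    """Tiny next-token language model: embed -> LSTM -> project to vocab."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids, hidden=None):
        # token_ids: (batch, seq_len) integer indices
        embedded = self.embedding(token_ids)
        output, hidden = self.lstm(embedded, hidden)
        logits = self.fc(output)  # (batch, seq_len, vocab_size)
        return logits, hidden
```

Training would pair the logits with `nn.CrossEntropyLoss` against the input sequence shifted by one token.

For the BERT and GPT-2 tasks, one common route is the Hugging Face `transformers` pipeline API; the sketch below assumes that library and its default checkpoints, which may differ from the models used in class.

```python
from transformers import pipeline

# Sentiment analysis with a fine-tuned BERT-family checkpoint (library default)
sentiment = pipeline("sentiment-analysis")
print(sentiment("The lecture on attention was surprisingly clear."))

# Named entity recognition; aggregation merges word pieces into entity spans
ner = pipeline("ner", aggregation_strategy="simple")
print(ner("Hugging Face is based in New York City."))

# Text generation with GPT-2
generator = pipeline("text-generation", model="gpt2")
print(generator("Natural language processing is", max_length=30, num_return_sequences=1))
```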

## Lesson Plan

| Duration | What | How or Why |
| --- | --- | --- |
| 5 mins | Start zoom session | So that learners can join early and start class on time. |
| 20 mins | Activity | Recap on self-study and prework materials. |
| 40 mins | Code-along | Part 1: Deep Learning for NLP and common NLP tasks. |
| **1 HR MARK** | | |
| 30 mins | Code-along | Part 2: RNNs - LSTM and GRU. |
| 10 mins | Break | |
| 20 mins | Code-along | Part 3: Attention mechanism. |
| **2 HR MARK** | | |
| 50 mins | Code-along | Part 4: Transformers - BERT and GPT. |
| 10 mins | Briefing / Q&A | Brief on references, assignment, quiz and Q&A. |
| **END CLASS 3 HR MARK** | | |
