mei-pan/Natural_Language_Processing_Specialization

Natural Language Processing Specialization Link.

This specialization is a series of four courses offered by Deeplearning.ai that focuses on cutting-edge NLP techniques.

Course 1: NLP with Classification and Vector Spaces Link

Objective: Use logistic regression, naïve Bayes, and word vectors to implement sentiment analysis, complete analogies, and translate words.

Week 1:

Learn to extract features from text into numerical vectors, then build a sentiment analysis model for tweets using logistic regression!

     Assignment: Logistic Regression Link.
        - Sentiment analysis of tweets with Logistic Regression.
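The assignment's own implementation isn't reproduced here; a minimal sketch of logistic-regression sentiment classification in NumPy, using toy count features (all data and names below are illustrative, not from the course):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logreg(X, y, lr=0.1, iters=1000):
    # X: (m, n) feature matrix with a bias column; y: (m,) labels in {0, 1}
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        h = sigmoid(X @ theta)
        theta -= lr * X.T @ (h - y) / len(y)   # batch gradient descent step
    return theta

# Toy features per tweet: [bias, positive-word count, negative-word count]
X = np.array([[1, 3, 0], [1, 2, 1], [1, 0, 3], [1, 1, 2]], dtype=float)
y = np.array([1, 1, 0, 0], dtype=float)

theta = train_logreg(X, y)
preds = (sigmoid(X @ theta) > 0.5).astype(int)
```

The course assignment extracts similar positive/negative frequency features from preprocessed tweets before fitting the weights.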

Week 2:

Learn the theory behind Bayes' rule for conditional probabilities, then apply it toward building a Naive Bayes model for sentiment analysis.
     Assignment: Naive Bayes Link
       - Sentiment analysis of tweets with a Naive Bayes model.
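A minimal sketch of the Naive Bayes approach with Laplace smoothing, scoring a document by its summed log-likelihood ratios (toy data; function names are illustrative):

```python
from collections import Counter
import math

def train_naive_bayes(docs, labels):
    # docs: list of token lists; labels: 1 (positive) or 0 (negative)
    pos, neg = Counter(), Counter()
    for toks, lab in zip(docs, labels):
        (pos if lab == 1 else neg).update(toks)
    vocab = set(pos) | set(neg)
    n_pos, n_neg, v = sum(pos.values()), sum(neg.values()), len(vocab)
    # Laplace-smoothed log-likelihood ratio for each word in the vocabulary
    loglik = {w: math.log((pos[w] + 1) / (n_pos + v))
                 - math.log((neg[w] + 1) / (n_neg + v)) for w in vocab}
    logprior = math.log(sum(labels) / (len(labels) - sum(labels)))
    return logprior, loglik

def predict(tokens, logprior, loglik):
    # Positive score -> positive sentiment; unseen words contribute nothing
    return logprior + sum(loglik.get(w, 0.0) for w in tokens)

docs = [["great", "movie"], ["happy", "great"], ["bad", "sad"], ["sad", "movie"]]
labels = [1, 1, 0, 0]
lp, ll = train_naive_bayes(docs, labels)
score = predict(["great", "happy"], lp, ll)
```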

Week 3:

Understand vector spaces, and ways to utilize Euclidean distance and cosine similarity to find similar and dissimilar items with embeddings and PCA.
     Assignment: Vector Space Model & PCA Link.
       - Utilize vector spaces and word embeddings to build a function that predicts the capital city given a country name.
       - The project also includes building a function that computes PCA.
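A minimal sketch of both pieces: cosine similarity for the country-to-capital analogy, and PCA via eigen-decomposition of the covariance matrix. The two-dimensional embeddings below are made up for illustration; the assignment uses real pretrained word vectors:

```python
import numpy as np

def cosine_similarity(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def compute_pca(X, n_components=2):
    # Center the data, then project onto the top eigenvectors of the covariance
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]          # largest eigenvalues first
    return Xc @ eigvecs[:, order[:n_components]]

# Analogy idea: capital(Italy) ~ Italy + (Paris - France)
emb = {"France": np.array([1.0, 0.0]), "Paris": np.array([1.0, 1.0]),
       "Italy": np.array([0.9, 0.1]), "Rome": np.array([0.9, 1.1])}
guess = emb["Italy"] + (emb["Paris"] - emb["France"])
best = max(["Paris", "Rome", "France"],
           key=lambda w: cosine_similarity(guess, emb[w]))
```

In the assignment, PCA is then used to project the high-dimensional embeddings down to two dimensions for visualization.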

Week 4:

Learn to transform word vectors and assign them to subsets using locality sensitive hashing, in order to perform machine translation and document search.
     Assignment: Machine Translation and LSH (Locality-Sensitive Hashing) Link
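The core LSH trick is to hash each vector by which side of several random hyperplanes it falls on, so nearby vectors tend to share a bucket. A minimal sketch (plane count and dimensions are arbitrary choices for illustration):

```python
import numpy as np

def hash_vector(v, planes):
    # One bit per plane: 1 if the vector is on the positive side of the hyperplane
    bits = (planes @ v >= 0).astype(int)
    return int("".join(map(str, bits)), 2)   # combine bits into a bucket index

rng = np.random.default_rng(0)
planes = rng.standard_normal((4, 3))   # 4 random planes in 3-d -> 16 buckets
v = np.array([1.0, 0.5, -0.2])

bucket = hash_vector(v, planes)
same = hash_vector(1.01 * v, planes)   # positive scaling preserves all signs
```

Restricting nearest-neighbor search to a vector's bucket is what makes document search over large embedding sets fast in the assignment.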

Course 2: Natural Language Processing with Probabilistic Models

Week 1:

Learn about autocorrect, minimum edit distance, and dynamic programming, then build a spellchecker to correct misspelled words!
     Assignment: Autocorrection Link.
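The dynamic-programming piece is minimum edit distance. A minimal sketch, using the replace cost of 2 that the assignment conventionally uses (costs are parameters, so other conventions work too):

```python
def min_edit_distance(source, target, ins_cost=1, del_cost=1, rep_cost=2):
    # D[i][j] = minimum cost of editing source[:i] into target[:j]
    m, n = len(source), len(target)
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        D[i][0] = i * del_cost          # delete everything
    for j in range(1, n + 1):
        D[0][j] = j * ins_cost          # insert everything
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            r = 0 if source[i - 1] == target[j - 1] else rep_cost
            D[i][j] = min(D[i - 1][j] + del_cost,     # delete
                          D[i][j - 1] + ins_cost,     # insert
                          D[i - 1][j - 1] + r)        # replace or match
    return D[m][n]
```

The spellchecker ranks candidate corrections by combining edit distance with word probabilities from a corpus.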

Week 2:

Learn about Markov and hidden Markov models as well as the Viterbi algorithm to create part-of-speech (POS) tags for a Wall Street Journal text corpus.
     Assignment: POS Tagging Link.
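A minimal sketch of Viterbi decoding over a two-tag HMM in log space. The tiny hand-set probabilities below are invented for illustration; the assignment estimates transition and emission probabilities from tagged WSJ data:

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    # V[t][s] = log-probability of the best tag path ending in state s at time t
    V = [{s: start_p[s] + emit_p[s][obs[0]] for s in states}]
    path = {s: [s] for s in states}
    for o in obs[1:]:
        V.append({})
        new_path = {}
        for s in states:
            prev, score = max(((p, V[-2][p] + trans_p[p][s]) for p in states),
                              key=lambda x: x[1])
            V[-1][s] = score + emit_p[s][o]
            new_path[s] = path[prev] + [s]
        path = new_path
    best = max(states, key=lambda s: V[-1][s])   # backtrace via stored paths
    return path[best]

lg = math.log
states = ["NN", "VB"]
start_p = {"NN": lg(0.6), "VB": lg(0.4)}
trans_p = {"NN": {"NN": lg(0.3), "VB": lg(0.7)},
           "VB": {"NN": lg(0.8), "VB": lg(0.2)}}
emit_p = {"NN": {"dogs": lg(0.7), "run": lg(0.3)},
          "VB": {"dogs": lg(0.1), "run": lg(0.9)}}

tags = viterbi(["dogs", "run"], states, start_p, trans_p, emit_p)
```

Working in log space avoids the numerical underflow that multiplying many small probabilities would cause on real-length sentences.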

Week 3:

Learn about how N-gram language models work by calculating sequence probabilities, then build an autocomplete language model using a text corpus from Twitter!
     Assignment: Autocomplete Link.
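A minimal bigram sketch of the idea: count word pairs, then suggest the most frequent continuation. This uses plain maximum-likelihood counts; the assignment adds smoothing and perplexity evaluation on real tweet data (the corpus below is a toy):

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    # Count how often each word follows each other word across sentences
    bigrams = defaultdict(Counter)
    for sentence in corpus:
        tokens = ["<s>"] + sentence.split()      # <s> marks sentence start
        for prev, nxt in zip(tokens, tokens[1:]):
            bigrams[prev][nxt] += 1
    return bigrams

def suggest(prev_word, bigrams):
    # Most frequent continuation (unsmoothed MLE); None if the word is unseen
    if not bigrams[prev_word]:
        return None
    return bigrams[prev_word].most_common(1)[0][0]

corpus = ["i am happy", "i am learning", "i am happy today"]
model = train_bigrams(corpus)
next_word = suggest("am", model)
```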

Week 4:

Learn methods to build word embeddings from scratch.
    Assignment: Word Embeddings Link.
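A minimal sketch of the continuous bag-of-words (CBOW) approach used in the assignment: predict the center word from the average of its context vectors with a softmax output layer. Dimensions, learning rate, and the tiny corpus are arbitrary illustrative choices:

```python
import numpy as np

def train_cbow(tokens, dim=10, window=2, lr=0.05, epochs=200, seed=0):
    vocab = sorted(set(tokens))
    idx = {w: i for i, w in enumerate(vocab)}
    rng = np.random.default_rng(seed)
    W_in = rng.standard_normal((len(vocab), dim)) * 0.1   # input embeddings
    W_out = rng.standard_normal((dim, len(vocab))) * 0.1  # output weights
    for _ in range(epochs):
        for c in range(window, len(tokens) - window):
            context = [idx[tokens[c + o]]
                       for o in range(-window, window + 1) if o != 0]
            h = W_in[context].mean(axis=0)        # averaged context vector
            z = h @ W_out
            p = np.exp(z - z.max()); p /= p.sum() # softmax over the vocabulary
            g = p.copy(); g[idx[tokens[c]]] -= 1.0  # cross-entropy gradient
            grad_h = W_out @ g
            W_out -= lr * np.outer(h, g)
            W_in[context] -= lr * grad_h / len(context)
    return {w: W_in[idx[w]] for w in vocab}

tokens = "i am happy because i am learning".split()
emb = train_cbow(tokens)
```

After training, the rows of the input matrix serve as the learned word embeddings.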

Course 3: Natural Language Processing in TensorFlow

Course 4: Sequences, Time Series and Prediction
