
Update README.md
zihaomu authored Sep 8, 2023
1 parent 544115f commit b8501cc
Showing 1 changed file with 1 addition and 1 deletion.
lessons/5-NLP/20-LangModels/README.md
@@ -10,7 +10,7 @@ The idea of a neural network being able to do general tasks without downstream t

> Understanding and being able to produce text also entails knowing something about the world around us. People also learn by reading to a large extent, and GPT networks are similar in this respect.
Text generation networks work by predicting the probability of the next word $$P(w_N)$$. However, the unconditional probability of the next word simply equals the frequency of that word in the text corpus. GPT is able to give us the **conditional probability** of the next word, given the previous ones: $$P(w_N | w_{N-1}, ..., w_0)$$
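
To make the conditional probability concrete, here is a minimal sketch that asks a pretrained model for its distribution over the next token. It assumes the Hugging Face `transformers` library and the public `gpt2` checkpoint; the model choice and the prompt string are illustrative, not part of this lesson's own materials.

```python
# A minimal sketch: querying GPT-2 for P(w_N | w_{N-1}, ..., w_0).
# Assumes `pip install torch transformers`; model and prompt are illustrative.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

inputs = tokenizer("The capital of France is", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, sequence_length, vocab_size)

# The logits at the last position define the distribution over the next token.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)

top = torch.topk(next_token_probs, k=5)
for prob, idx in zip(top.values, top.indices):
    print(f"{tokenizer.decode([int(idx)])!r}: {prob.item():.3f}")
```

Picking the most probable token at every step gives greedy decoding; sampling from this distribution instead is what makes generated text varied.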

> You can read more about probabilities in our [Data Science for Beginners Curriculum](https://github.com/microsoft/Data-Science-For-Beginners/tree/main/1-Introduction/04-stats-and-probability)
