Merge pull request microsoft#244 from zihaomu/patch-1
Update README.md
BethanyJep committed Oct 27, 2023
2 parents 17afcab + b8501cc commit 4bc6213
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion lessons/5-NLP/20-LangModels/README.md
@@ -10,7 +10,7 @@ The idea of a neural network being able to do general tasks without downstream t

> Understanding and being able to produce text also entails knowing something about the world around us. People also learn by reading to a large extent, and the GPT network is similar in this respect.
- Text generation networks wor;k by predicting probability of the next word $$P(w_N)$$ However, unconditional probability of the next word equals to the frequency of the this word in the text corpus. GPT is able to give us **conditional probability** of the next word, given the previous ones: $$P(w_N | w_{n-1}, ..., w_0)$$
+ Text generation networks work by predicting probability of the next word $$P(w_N)$$ However, unconditional probability of the next word equals to the frequency of the this word in the text corpus. GPT is able to give us **conditional probability** of the next word, given the previous ones: $$P(w_N | w_{n-1}, ..., w_0)$$

> You can read more about probabilities in our [Data Science for Beginners Curriculum](https://github.com/microsoft/Data-Science-For-Beginners/tree/main/1-Introduction/04-stats-and-probability)
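The corrected line describes GPT's core mechanism: rather than the unconditional word frequency $$P(w_N)$$, the network models the conditional probability of the next word given all preceding ones, $$P(w_N | w_{N-1}, ..., w_0)$$. A minimal sketch of what that distribution looks like in practice, assuming the Hugging Face `transformers` and `torch` packages and the public `gpt2` checkpoint (an illustration only, not part of this commit):

```python
# Sketch: inspecting P(w_N | w_{N-1}, ..., w_0) with a pretrained GPT-2.
# Assumes `pip install torch transformers`; the prompt is arbitrary.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

prompt = "Neural networks can"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)

# Softmax over the logits at the last position gives the conditional
# distribution of the next token given the previous ones.
probs = torch.softmax(logits[0, -1], dim=-1)
top_probs, top_ids = probs.topk(5)
for p, i in zip(top_probs, top_ids):
    print(f"{tokenizer.decode(int(i))!r}: {p.item():.3f}")
```

Generation then amounts to repeatedly sampling (or greedily picking) a token from this distribution, appending it to the context, and recomputing the distribution.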
