Commit 0ce1d28

Author: Vineet John
Commit message: Additional points added
2 parents 3fd0158 + aa6cec9

File tree

1 file changed: +9 -4 lines changed

project-report/cs698_project_report.tex

Lines changed: 9 additions & 4 deletions
@@ -78,9 +78,6 @@ \section{Goal} % (fold)
 % section goal (end)
 
 
-\newpage
-
-
 \section{A Primer of Neural Net Models for NLP} % (fold)
 \label{sec:a_primer_of_neural_net_models_for_nlp}
 
@@ -111,6 +108,7 @@ \section{A Primer of Neural Net Models for NLP} % (fold)
 \section{A Neural Probabilistic Language Model} % (fold)
 \label{sec:a_neural_probabilistic_language_model}
 
+\textbf{Background:}
 \begin{itemize}
 \item
 We propose to fight the curse of dimensionality by learning a distributed representation for words which allows each training sentence to inform the model about an exponential number of semantically neighboring sentences.
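The line added above states the paper's core claim: learned feature vectors place semantically similar words close together, so one training sentence also informs the model about unseen neighboring sentences. A minimal illustrative sketch of that idea in Python (not part of the commit; the words and the 3-dimensional vectors are invented for illustration):

import numpy as np

# Hypothetical learned feature vectors; the actual model learns these
# jointly with the language-model parameters.
vectors = {
    "cat": np.array([0.9, 0.1, 0.3]),
    "dog": np.array([0.8, 0.2, 0.3]),
    "car": np.array([0.1, 0.9, 0.7]),
}

def cosine(a, b):
    # Cosine similarity: values near 1.0 mean nearly identical direction.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "cat" and "dog" end up with almost equivalent feature vectors, so a
# sentence observed with "cat" generalizes to the unseen "dog" variant.
print(cosine(vectors["cat"], vectors["dog"]))  # ~0.99, close neighbors
print(cosine(vectors["cat"], vectors["car"]))  # ~0.36, unrelated words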
@@ -127,9 +125,16 @@ \section{A Neural Probabilistic Language Model} % (fold)
 \item
 This will ensure that semantically similar words end up with almost equivalent feature vectors, called learned distributed feature vectors.
 \item
-
+A challenge with modeling discrete variables like sentence structure, as opposed to continuous values, is that a continuous-valued function can be assumed to have some form of locality, but the same assumption cannot be made for discrete functions.
+\item
+N-gram models try to achieve a statistical model of language by calculating the conditional probability of each possible word that can follow a set of $n$ preceding words.
+\item
+New sequences of words can be generated by effectively gluing together popular combinations, i.e. n-grams with very high frequency counts.
 \end{itemize}
 
+\textbf{Goal of the paper:}
+Knowing the basic structure of a sentence, we should be able to create a new sentence by replacing parts of the old sentence with interchangeable elements.
+
 % section a_neural_probabilistic_language_model (end)
 
 
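The two new n-gram bullets in this hunk describe the classical count-based approach: estimate $P(w_t \mid w_{t-n+1}, \ldots, w_{t-1})$ from corpus frequencies, then generate text by chaining together high-frequency combinations. A minimal bigram ($n = 2$) sketch in Python, with a toy corpus and invented helper names, offered only to illustrate those bullets, not as anything from the committed report:

import random
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the dog sat on the rug".split()

# Count how often each word follows each one-word context.
follow_counts = defaultdict(Counter)
for prev, curr in zip(corpus, corpus[1:]):
    follow_counts[prev][curr] += 1

def conditional_prob(word, context):
    # P(word | context) = count(context, word) / count(context, *).
    total = sum(follow_counts[context].values())
    return follow_counts[context][word] / total if total else 0.0

def generate(start, length=8):
    # Glue popular combinations together by sampling successors in
    # proportion to their bigram frequency counts.
    out = [start]
    for _ in range(length - 1):
        successors = follow_counts[out[-1]]
        if not successors:
            break
        words, weights = zip(*successors.items())
        out.append(random.choices(words, weights=weights)[0])
    return " ".join(out)

print(conditional_prob("sat", "cat"))  # 1.0: "cat" is always followed by "sat"
print(generate("the"))                 # e.g. "the cat sat on the rug the dog"

The same counting scheme extends to larger $n$ by keying follow_counts on tuples of the $n - 1$ preceding words, at the cost of exponentially sparser counts, which is exactly the curse of dimensionality the paper targets.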