Contextualized Embeddings are embeddings such as those produced by Transformers.
Contextualized Embeddings can generate different vector representations for the different meanings a single word can have (this is called polysemy). For example, the word "bank" has many different meanings (a financial institution, the bank of a river, etc.). All these different meanings of "bank" get completely different vectors in Contextualized Embeddings.
In NON-Contextualized Embeddings, the word "bank" is represented by a single vector in embedding space, which is of course a very poor representation!
Therefore my question:
Are these embeddings Contextualized Embeddings?
Hi, thanks a lot for your interest in INSTRUCTOR!
I guess we need to provide context in order to get contextualized embeddings. For example, if we are only provided with a single word "bank", I think it is hard to guess the meaning it refers to.
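The distinction above can be sketched with a toy example. This is not INSTRUCTOR's actual architecture, just an illustrative stand-in: the static vectors, the vocabulary, and the simple context-mixing function are all made up for demonstration. A non-contextual model looks the word up in a fixed table, so "bank" gets the same vector everywhere; a contextual model computes the vector as a function of the whole sentence, so the two occurrences of "bank" come out different:

```python
import numpy as np

# Hypothetical 3-d static word vectors (illustrative only, not real embeddings).
STATIC = {
    "river": np.array([1.0, 0.0, 0.0]),
    "money": np.array([0.0, 1.0, 0.0]),
    "bank":  np.array([0.5, 0.5, 0.0]),  # a single vector, regardless of context
    "the":   np.array([0.0, 0.0, 1.0]),
}

def static_embed(word, sentence):
    # A non-contextual model ignores the sentence entirely.
    return STATIC[word]

def contextual_embed(word, sentence):
    # Toy stand-in for self-attention: mix the word's static vector
    # with the mean of the other words in the sentence.
    context = np.mean([STATIC[w] for w in sentence if w != word], axis=0)
    return 0.5 * STATIC[word] + 0.5 * context

s1 = ["the", "river", "bank"]   # "bank" as in riverbank
s2 = ["the", "money", "bank"]   # "bank" as in financial institution

# Static: identical vectors for "bank" in both sentences.
print(np.allclose(static_embed("bank", s1), static_embed("bank", s2)))  # True

# Contextual: the two occurrences of "bank" get different vectors.
print(np.allclose(contextual_embed("bank", s1), contextual_embed("bank", s2)))  # False
```

This also illustrates the point about a single word: if the input is just `["bank"]`, there is no surrounding context to mix in, so even a contextual model has nothing to disambiguate the meaning with.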