
Are these embeddings Contextualized Embeddings? #103

Open
jbdatascience opened this issue Dec 26, 2023 · 1 comment

Comments

@jbdatascience

Are these embeddings Contextualized Embeddings?

Contextualized Embeddings are embeddings such as those produced by Transformers.
Contextualized Embeddings can generate a different vector representation for each of the meanings a single word can have (this phenomenon is called polysemy). For example, the word "bank" has many different meanings (a financial institution, the bank of a river, and so on). Each of these meanings of "bank" gets a distinct vector under a Contextualized Embedding.
In NON-Contextualized Embeddings, the word "bank" is represented by a single vector in embedding space, which is of course a poor representation!

Therefore my question:
Are these embeddings Contextualized Embeddings?
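To make the distinction concrete, here is a toy sketch (the vocabulary and vectors are made up, and this is not the INSTRUCTOR model): a static, non-contextualized embedding is just a fixed lookup table, so "bank" receives the identical vector no matter which sentence it appears in.

```python
# Toy static (NON-contextualized) embedding: a fixed word-to-vector table.
# Every occurrence of "bank" maps to the same vector, so the financial
# and river meanings are collapsed into one representation.
STATIC_EMBEDDINGS = {
    "the": [0.1, 0.1], "river": [0.0, 1.0], "bank": [0.5, 0.5],
    "charges": [0.9, 0.1], "fees": [1.0, 0.0], "overflowed": [0.2, 0.8],
}

def static_embed(word):
    """Look up the single, context-independent vector for a word."""
    return STATIC_EMBEDDINGS[word]

# "bank" as a financial institution vs. the bank of a river:
vec_finance = static_embed("bank")  # from "the bank charges fees"
vec_river = static_embed("bank")    # from "the river bank overflowed"
print(vec_finance == vec_river)     # same vector for both meanings
```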

@hongjin-su
Collaborator

Hi, thanks a lot for your interest in INSTRUCTOR!

I think we need to provide context in order to get contextualized embeddings. For example, if we are given only the single word "bank", it is hard to tell which meaning it refers to.
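The point above can be sketched with a toy contextual encoder (again, made-up vectors, not the actual INSTRUCTOR model): once a full sentence is supplied, the target word's vector can be derived from its neighbours, standing in for what a Transformer does with attention, so "bank" comes out different in different sentences.

```python
# Toy contextualized embedding: represent the target word by the average
# of the static vectors of the OTHER words in the sentence. This is a
# stand-in for attention-based context mixing in a Transformer.
STATIC = {
    "the": [0.1, 0.1], "river": [0.0, 1.0], "bank": [0.5, 0.5],
    "charges": [0.9, 0.1], "fees": [1.0, 0.0], "overflowed": [0.2, 0.8],
}

def contextual_embed(sentence, target):
    """Average the static vectors of all words around `target`."""
    context = [w for w in sentence.split() if w != target]
    dims = zip(*(STATIC[w] for w in context))
    return [sum(d) / len(context) for d in dims]

river_bank = contextual_embed("the river bank overflowed", "bank")
money_bank = contextual_embed("the bank charges fees", "bank")
print(river_bank != money_bank)  # context yields different vectors for "bank"
```

With no context (a bare "bank"), this scheme has nothing to average over, which mirrors the maintainer's point that a single isolated word cannot be disambiguated.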
