❓ Questions and Help
Hi, we can use GloVe embeddings when building the vocab in torchtext, with something like:

```python
import torch

MIN_FREQ = 2
TEXT.build_vocab(train_data,
                 min_freq=MIN_FREQ,
                 vectors="glove.6B.300d",
                 unk_init=torch.Tensor.normal_)
```
We can also create embeddings with the flair library, for example:

```python
from typing import List
from flair.embeddings import (BertEmbeddings, CharacterEmbeddings,
                              ELMoEmbeddings, FlairEmbeddings,
                              StackedEmbeddings, TokenEmbeddings,
                              WordEmbeddings)

embedding_types: List[TokenEmbeddings] = [
    WordEmbeddings('glove'),
    # comment in this line to use character embeddings
    # CharacterEmbeddings(),
    # comment in these lines to use flair embeddings
    FlairEmbeddings('news-forward'),
    FlairEmbeddings('news-backward'),
    ELMoEmbeddings(),
    BertEmbeddings('bert-base-uncased'),
]
embeddings: StackedEmbeddings = StackedEmbeddings(embeddings=embedding_types)
```
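For context, in flair these embeddings are computed per token from running text rather than looked up in a static table, so the usual flair workflow is along these lines:

```python
from flair.data import Sentence

sentence = Sentence('The grass is green.')
embeddings.embed(sentence)  # fills in one stacked vector per token

for token in sentence:
    print(token.text, token.embedding.shape)
```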
Could I use the stacked flair embeddings above instead of GloVe in the torchtext `build_vocab` call?
Is anything similar to this supported?
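To make the question concrete, here is a rough sketch of what I imagine would be needed. Since `build_vocab` also accepts a `torchtext.vocab.Vectors` instance, presumably one could export one vector per vocabulary word to a text file and load it back; the file name below is made up, and I have not verified this works for contextual embeddings:

```python
import torch
from torchtext.vocab import Vectors

# Hypothetical: a whitespace-separated file with one "word v1 v2 ... vN"
# line per vocabulary word, exported from the stacked flair embeddings.
stacked_vectors = Vectors(name='stacked_flair_vectors.txt',  # made-up file name
                          unk_init=torch.Tensor.normal_)

TEXT.build_vocab(train_data,
                 min_freq=MIN_FREQ,
                 vectors=stacked_vectors,  # instead of "glove.6B.300d"
                 unk_init=torch.Tensor.normal_)
```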