Replies: 2 comments
-
This would be nice to have, but I'm not sure when I'll have time to look into it.
-
Is there any work on this? I would love to see this feature, and would also like to help with it.
-
  ```python
  from flair.embeddings import StackedEmbeddings, TransformerDocumentEmbeddings

  # Combine several document-level transformer embeddings into one stack.
  embedding_types = [
      TransformerDocumentEmbeddings('bert-base-uncased', fine_tune=True),
      TransformerDocumentEmbeddings('roberta-base', fine_tune=True),
      TransformerDocumentEmbeddings('microsoft/DialoGPT-small', fine_tune=True),
  ]
  embeddings: StackedEmbeddings = StackedEmbeddings(embeddings=embedding_types)
  ```
  This stacks many transformer models at a time, which should also work for sentence pair classification. Like this: StackedEmbeddings code, line 50.
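
  For context, here is a rough sketch of how such stacked embeddings might be wired into Flair's sentence pair classification. This is an assumption-laden illustration, not a confirmed recipe: argument names on `TextPairClassifier` differ between Flair releases (`embeddings` vs. `document_embeddings`), `StackedEmbeddings` is primarily designed for token-level embeddings so stacking document embeddings may need a custom wrapper, and `GLUE_RTE` plus the `fine_tune` call simply stand in for whatever corpus and training setup you actually use.

  ```python
  # Sketch only: assumes a recent Flair release; argument names may differ in yours,
  # and stacking document embeddings this way is an untested assumption.
  from flair.datasets import GLUE_RTE
  from flair.embeddings import StackedEmbeddings, TransformerDocumentEmbeddings
  from flair.models import TextPairClassifier
  from flair.trainers import ModelTrainer

  corpus = GLUE_RTE()  # any sentence-pair corpus would do here
  label_type = "entailment"
  label_dict = corpus.make_label_dictionary(label_type=label_type)

  # Stack several document-level transformer embeddings.
  stacked = StackedEmbeddings(embeddings=[
      TransformerDocumentEmbeddings("bert-base-uncased", fine_tune=True),
      TransformerDocumentEmbeddings("roberta-base", fine_tune=True),
  ])

  # Older releases may expect document_embeddings= instead of embeddings=.
  classifier = TextPairClassifier(
      embeddings=stacked,
      label_dictionary=label_dict,
      label_type=label_type,
  )

  trainer = ModelTrainer(classifier, corpus)
  trainer.fine_tune("resources/taggers/pair-stacked",
                    learning_rate=5e-6, mini_batch_size=8)
  ```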