Issues: explosion/spacy-llm
Labels: usage = "How to use `spacy-llm`"; feat/model = "Feature: models"

- #443: Many returns are not what I want [usage] (opened Feb 18, 2024 by tianchiguaixia)
- #442: How to surpass BERT through large models [usage] (opened Feb 17, 2024 by tianchiguaixia)
- #436: Working dummy example for custom LLM endpoint integration [usage] (opened Jan 29, 2024 by borhenryk)
- #423: [Warning] the current text generation call will exceed the model's predefined maximum length (4096). [feat/model, usage] (opened Jan 21, 2024 by yileitu)
- #411: '<' not supported between instances of 'str' and 'int' [usage] (opened Jan 8, 2024 by BaptisteLoquette)
- #393: Inconsistent output on Dolly NER [feat/model, usage] (opened Dec 1, 2023 by nxitik)