
Add Ollama support #465

Open · wants to merge 5 commits into main
Conversation

@strickvl commented Apr 28, 2024

Added Ollama support

Description

I added support for Ollama, which can now be used in conjunction with spacy-llm. I also added all the models Ollama currently supports, but perhaps it would be better to let users define those themselves? I thought it nicer to have a set of pre-baked models available.

Corresponding documentation PR

Docs updates are at explosion/spaCy#13465

Types of change

feature

Quickstart / Example of usage

from spacy_llm.util import assemble

nlp = assemble("/home/strickvl/coding/spacy-llm/test_zone/config.cfg")

doc = nlp("In early April, Amazon announced plans to expand its operations into Delft, Netherlands, aiming to strengthen its technology hub in Europe. The CEO, Andy Jassy, mentioned during a press conference at The Hague that this move would create over 500 new jobs in the region by the end of 2024. Meanwhile, in a related development, Microsoft, under the leadership of Satya Nadella, has launched a new cloud computing service in partnership with the University of Cambridge. This collaboration aims to facilitate advanced research in artificial intelligence and machine learning applications.")

print([(ent.text, ent.label_) for ent in doc.ents])

...and using a config.cfg file...

[nlp]
lang = "en"
pipeline = ["llm"]

[components]

[components.llm]
factory = "llm"

[components.llm.task]
@llm_tasks = "spacy.NER.v3"
labels = ["PERSON", "ORGANISATION", "LOCATION"]

[components.llm.model]
@llm_models = "spacy.Ollama.v1"
name = "notus"

Then make sure you have everything set up to run locally:

ollama serve
ollama pull notus
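Before wiring this into spaCy, you can sanity-check that the server is up and that the model has been pulled by querying Ollama's REST API, which listens on port 11434 by default. This is just an illustrative sketch (the helper names below are mine, not part of spacy-llm):

```python
# Sanity check for a local Ollama server (default port 11434).
# `model_names` and `list_local_models` are illustrative helpers, not spacy-llm API.
import json
from urllib import request, error

OLLAMA_URL = "http://localhost:11434"

def model_names(payload: dict) -> list:
    """Extract model names from the JSON returned by Ollama's /api/tags endpoint."""
    return [m["name"] for m in payload.get("models", [])]

def list_local_models(base_url: str = OLLAMA_URL) -> list:
    """Return the names of all models already pulled into the local Ollama server."""
    with request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        return model_names(json.load(resp))

if __name__ == "__main__":
    try:
        print(list_local_models())
    except error.URLError:
        print("Ollama server not reachable -- run `ollama serve` first.")
```

If `notus` doesn't appear in the returned list, `ollama pull notus` hasn't completed yet.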

Perhaps needless to say, you'll need a machine with a GPU (or tolerance for slow CPU inference) to run this and use the Ollama models.

Then run the Python file and you should get something like:

[('Delft, Netherlands', 'LOCATION'), ('Andy Jassy', 'PERSON'), ('Satya Nadella', 'PERSON'), ('Cambridge', 'LOCATION')]
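If you want the entities in a more structured form downstream, the printed (text, label) pairs can be grouped by label with plain Python (no spaCy needed for this step; a minimal sketch):

```python
# Group (text, label) pairs -- like those printed above -- by entity label.
from collections import defaultdict

def group_entities(ents):
    grouped = defaultdict(list)
    for text, label in ents:
        grouped[label].append(text)
    return dict(grouped)

ents = [("Delft, Netherlands", "LOCATION"), ("Andy Jassy", "PERSON"),
        ("Satya Nadella", "PERSON"), ("Cambridge", "LOCATION")]
print(group_entities(ents))
# {'LOCATION': ['Delft, Netherlands', 'Cambridge'], 'PERSON': ['Andy Jassy', 'Satya Nadella']}
```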

Checklist

  • I confirm that I have the right to submit this contribution under the project's MIT license.
  • I ran all tests in tests and usage_examples/tests, and all new and existing tests passed. This includes
    • all external tests (i.e. pytest run with --external)
    • all tests requiring a GPU (i.e. pytest run with --gpu)
  • My changes don't require a change to the documentation, or if they do, I've added all required information.

@svlandeg svlandeg added feat/new New feature feat/model Feature: models labels Apr 29, 2024
@strickvl (author) commented:
Do you have a rough idea when this might get merged in / released? I wanted to recommend this to someone for a project, but I don't feel fully comfortable asking them to do an editable install from the Git branch, etc.

@cognitivetech

👀
