Conversation

SamPink (Contributor) commented Apr 7, 2024

Using the LLM abstraction of LlamaIndex, I've added support for more models.

I tried to keep everything else as-is for simplicity.

I've already tested "anthropic:claude-3-haiku-20240307" vs. "groq:gemma-7b-it", and Haiku won!

Note: I'm not sure how phospho.lab, get_provider_and_model, and get_sync_client were being used.
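
For readers following the change, here is a minimal, hypothetical sketch of how a provider-prefixed model string like the ones above could be routed through LlamaIndex's LLM abstraction. This is not the PR's actual code: the build_llm helper is made up, and it assumes the llama-index-llms-anthropic and llama-index-llms-groq integration packages (plus the corresponding API keys in the environment) are available.

```python
# Hypothetical sketch, not the code from this PR: route a "provider:model"
# string to the matching LlamaIndex LLM class.
from llama_index.llms.anthropic import Anthropic
from llama_index.llms.groq import Groq


def build_llm(model_string: str):
    # Split only on the first ":" so the provider prefix comes off cleanly.
    provider, model = model_string.split(":", 1)
    if provider == "anthropic":
        return Anthropic(model=model)  # reads ANTHROPIC_API_KEY from the environment
    if provider == "groq":
        return Groq(model=model)  # reads GROQ_API_KEY from the environment
    raise ValueError(f"Unsupported provider: {provider}")


llm = build_llm("anthropic:claude-3-haiku-20240307")
print(llm.complete("Reply with a single word."))
```

Both classes implement the same LlamaIndex LLM interface (complete, chat, and their streaming variants), which is what lets the rest of the pipeline stay provider-agnostic.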

oulianov (Contributor) commented Apr 12, 2024

Very cool! Can you please remove the .vscode folder from your PR?
Once that's done I'll merge it. This is a nice addition, thank you!

SamPink (Contributor, Author) commented Apr 15, 2024

I've done this now!

oulianov (Contributor) commented

Super cool!
There is just one more edit needed to make sure there is no regression for unusual Ollama model names (the ones that have ":" in them).
I will give you the snippet to copy-paste and we should be good.
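
(The snippet itself isn't reproduced in the thread. For context, here is a minimal, hypothetical sketch of the kind of parsing that avoids this regression, assuming model strings such as "ollama:mistral:7b-instruct" where only the first ":" separates provider from model. The function name mirrors the get_provider_and_model helper mentioned earlier purely for illustration, and the default provider is an assumption.)

```python
# Hypothetical sketch of ":"-safe parsing; not the exact snippet from the PR.
def get_provider_and_model(model_string: str) -> tuple[str, str]:
    # partition splits on the first ":" only, so Ollama-style names keep their
    # tag: "ollama:mistral:7b-instruct" -> ("ollama", "mistral:7b-instruct").
    provider, sep, model = model_string.partition(":")
    if not sep:
        # No provider prefix given; which default applies here is an assumption.
        return "openai", model_string
    return provider, model


assert get_provider_and_model("ollama:mistral:7b-instruct") == (
    "ollama",
    "mistral:7b-instruct",
)
```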

SamPink and others added 2 commits on April 15, 2024 at 09:34, including:
Added support for models with ":" in the name
SamPink (Contributor, Author) commented Apr 15, 2024

I didn't do that properly at first, but I've just fixed the bugs there!

oulianov (Contributor) commented

Cool! Thank you again for your help.

oulianov merged commit 5701efb into OpenGenerativeAI:main on Apr 15, 2024.