feat: Add Anannas AI LLM service integration
#2848
Conversation
I'm curious re: the naming convention; should this be renamed to 14x-function-calling-anannas?
Hi @Haleshot, Anannas should be a community integration. Do you mind moving all of this to a repo that you maintain? Are you part of the team building Anannas? If so, that's great! You can add it to your company's GitHub, which will help with reputability. If not, no worries, we can still add it. As a reminder, see this guide for how to set up your repo and what's required to submit:
I don't mind moving it to our org's repo. The reason I went with the in-codebase approach is that it felt similar to the existing integrations present in

Any reason for suggesting the new-repo route? Just curious (hope this doesn't come across rudely). I suppose maintainability down the line, plus Anannas being a third-party provider in Pipecat's repo, etc. are some reasons?
I am, yes. Unlike the bey integration, which was recently merged, I felt we don't fit any of the existing categories listed (TTS, STT, LLMs, etc.) and fit better in the
We also do not have an installable
The reason is that we simply can't keep up with the number of integrations that are launching. Once something is merged into Pipecat, it becomes our responsibility to test, maintain, and support. But we don't want to limit developers' choices when it comes to options, so we want help in maintaining integrations.

Are you not an LLM service? Based on the code you submitted, you're using the OpenAI client to provide completions, so I think that's a pretty good categorization for you. You could also create a PyPI package for this integration, which would make it easier to use; it's a pretty low-effort step that would pay off in terms of ease of access. Hope these answers help explain things.
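For context on the PyPI suggestion, a standalone integration package of this kind needs only a minimal pyproject.toml. The sketch below is illustrative: the package name, version, and dependency pins are assumptions, not the actual Anannas package.

```toml
# Hypothetical pyproject.toml for a standalone Pipecat community integration.
# All names and version constraints here are illustrative assumptions.
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

[project]
name = "pipecat-anannas"          # assumed package name
version = "0.1.0"
description = "Anannas LLM service for Pipecat (community integration)"
requires-python = ">=3.10"
dependencies = [
    "pipecat-ai",                 # the core framework
    "openai",                     # Anannas exposes an OpenAI-compatible API
]
```

Publishing this to PyPI would let users `pip install` the integration without it living in the Pipecat monorepo.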
Hey team! Following up on this issue: I went with the in-codebase route since Anannas fits the same pattern as OpenRouter and similar OpenAI-compatible services.
This PR adds:
- the service class at src/pipecat/services/anannas/llm.py
- a function-calling example adapted from the OpenRouter one
- anannas as an optional dependency in pyproject.toml
- the API key variable in env.example

Links for reference:
Will also submit a docs (pipecat/docs) PR to add the anannas page under LLM services.
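For readers skimming the thread, the "same pattern as OpenRouter" mentioned above boils down to an OpenAI-compatible provider that mostly just overrides the endpoint. A rough self-contained sketch of that shape follows; the base class here is a stand-in, not Pipecat's actual OpenAILLMService, and the Anannas base URL is an assumption, not a confirmed endpoint.

```python
# Sketch of the OpenAI-compatible provider pattern this PR follows.
# The stand-in base class below substitutes for Pipecat's real OpenAI-based
# LLM service so the example runs on its own; the base URL is hypothetical.

class OpenAICompatibleLLMService:
    """Stand-in for an OpenAI-based LLM service base class."""

    def __init__(self, *, api_key: str, base_url: str, model: str):
        self.api_key = api_key
        self.base_url = base_url  # the OpenAI client would be pointed here
        self.model = model


class AnannasLLMService(OpenAICompatibleLLMService):
    """Anannas exposes an OpenAI-compatible API, so the service only needs
    to swap the default endpoint (URL below is an illustrative assumption)."""

    def __init__(self, *, api_key: str, model: str,
                 base_url: str = "https://api.anannas.ai/v1"):
        super().__init__(api_key=api_key, base_url=base_url, model=model)
```

The same shape covers OpenRouter and similar providers: subclass, change the default base_url, and the rest of the completion plumbing is inherited.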