
Conversation

@Haleshot

Hey team! Following up on this issue; went with the in-codebase route since Anannas fits the same pattern as OpenRouter and similar OpenAI-compatible services.

This PR adds the service class at src/pipecat/services/anannas/llm.py, a function-calling example adapted from the OpenRouter one, an optional dependency entry in pyproject.toml, and the API key variable in env.example.

Links for reference:

Will also submit a docs (pipecat/docs) PR to add the Anannas page under LLM services.
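
For reviewers skimming this: the service class is a thin subclass of the OpenAI service, same as the OpenRouter one. A rough sketch of the shape it takes (the import path follows the current service layout; the base URL and default model here are placeholders, not the real Anannas values):

```python
import os

from pipecat.services.openai.llm import OpenAILLMService


class AnannasLLMService(OpenAILLMService):
    """LLM service for the Anannas unified API (OpenAI-compatible endpoint)."""

    def __init__(
        self,
        *,
        api_key: str | None = None,
        base_url: str = "https://api.anannas.example/v1",  # placeholder, not the real endpoint
        model: str = "openai/gpt-4o-mini",  # placeholder default model
        **kwargs,
    ):
        # Fall back to the env var added to env.example in this PR.
        super().__init__(
            api_key=api_key or os.getenv("ANANNAS_API_KEY"),
            base_url=base_url,
            model=model,
            **kwargs,
        )
```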

@Haleshot
Author


I'm curious re: the naming convention; should this be renamed to 14x-function-calling-anannas?

@markbackman
Contributor

Hi @Haleshot, Anannas should be a community integration. Do you mind moving all of this to a repo that you maintain? Are you part of the team building Anannas? If so, that's great! You can add it to your company's GitHub, which will help with reputability. If not, no worries, we can still add it.

As a reminder, see this guide for how to set up your repo and what's required to submit:
https://github.com/pipecat-ai/pipecat/blob/main/COMMUNITY_INTEGRATIONS.md

@Haleshot
Author

Haleshot commented Oct 14, 2025

> Do you mind moving all of this to a repo that you maintain?

I don't mind moving it to our org's repo. The reason I went with the in-codebase approach is that it seemed similar to the existing integrations under pipecat/services. I read through the Contributing.md file listed, and after looking at the existing documentation as well, I felt that Anannas wouldn't fit aptly there (at least not under the current categories).

Any particular reason for suggesting the new-repo route? Just curious (hope this doesn't come across rudely). I suppose long-term maintainability, plus the fact that we're a third-party provider being added to Pipecat's repo, are some of the reasons?

> Are you part of the team building Anannas?

I am, yes.

Unlike the bey integration that was recently merged, I felt we don't fit any of the existing categories listed (TTS, STT, LLMs, etc.) and fit better under pipecat/services.

@Haleshot
Author

We also do not have an installable PyPI package; we're an OpenRouter alternative, a unified API to access any model. Would love to know your final thoughts on where you believe this is better suited.

@markbackman
Contributor

The reason is that we simply can't keep up with the number of integrations that are launching. Once something is merged into Pipecat, it becomes our responsibility to test, maintain, and support. But we don't want to limit developers' choices when it comes to options, so we ask for help in maintaining integrations.

Are you not an LLM service? Based on the code you submitted, you're using the OpenAI client to provide completions. I think this is a pretty good categorization for you.
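
Concretely, that categorization just means it drops in wherever an OpenAILLMService would today, something like this (the model id below is only illustrative, not a confirmed Anannas model name):

```python
import os

# Import path as added in this PR (src/pipecat/services/anannas/llm.py)
from pipecat.services.anannas.llm import AnannasLLMService

# Instantiate like any other OpenAI-compatible LLM service in Pipecat.
llm = AnannasLLMService(
    api_key=os.getenv("ANANNAS_API_KEY"),
    model="openai/gpt-4o-mini",  # placeholder model id
)
```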

You could create a PyPI package for this integration, which would make it easier to use. This is a pretty low effort thing that would pay off in terms of ease of access.

Hope these answers help explain things.
