Use a local LLM proxy to aggregate AI models #93

Open

Description

Describe the feature you'd like to request

Allow administrators to set up access to multiple LLMs in Nextcloud via a proxy.

Describe the solution you'd like

LiteLLM does an incredible job of this.

Combining this with Nextcloud would allow more options than the current OpenAI, LocalAI and Replicate integrations; it would also provide a single AI abstraction layer and allow integration with Nextcloud user management, etc.

LiteLLM's model aliases provide a useful way to run and call a single proxy with an unlimited number of APIs beneath it.
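To illustrate, here is a rough sketch of what such an alias setup looks like in LiteLLM's proxy config format; the alias names, backend models and key references are placeholders, not a recommendation:

```yaml
# Hypothetical LiteLLM proxy config: each alias maps to one backend.
# Nextcloud apps would only ever see the alias names.
model_list:
  - model_name: assistant-default        # alias called by Nextcloud
    litellm_params:
      model: openai/gpt-4o               # backend actually invoked
      api_key: os.environ/OPENAI_API_KEY
  - model_name: assistant-local          # second alias, local backend
    litellm_params:
      model: ollama/llama3
      api_base: http://localhost:11434
```

Swapping the backend behind an alias then requires no change on the Nextcloud side.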

The integration could tie into Nextcloud group permissions and tasks, allowing other apps to call the various AI models by alias, with permissions enforced through the abstraction layer.

If the aliases are exposed as bots, they could be called in Talk, etc.

This dovetails with the following Nextcloud Assistant enhancement request - nextcloud/assistant#76

Describe alternatives you've considered

Currently I am calling LiteLLM via the Nextcloud LocalAI integration.
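This works because the LiteLLM proxy speaks an OpenAI-compatible chat completions API, so the LocalAI integration can simply be pointed at it. A minimal sketch of the kind of request body involved, with a hypothetical endpoint and alias name:

```python
import json

# Hypothetical local proxy endpoint; LiteLLM defaults to port 4000.
PROXY_URL = "http://localhost:4000/v1/chat/completions"

def build_chat_request(alias: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat completion payload addressed
    to a proxy model alias (the proxy resolves it to a real backend)."""
    return {
        "model": alias,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("assistant-default", "Summarize this note.")
print(json.dumps(payload))
```

Any OpenAI-style client in Nextcloud could POST this payload to `PROXY_URL` unchanged, which is exactly why a single abstraction layer is attractive here.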

Metadata

    Labels

    enhancement (New feature or request)
