Google Gemini support? #512

Open
@jaiswalvineet

Description

For my use case, I am using dynamic routes that perform LLM calls to determine input arguments. While reviewing the code, I found that these are the supported LLM integrations:

├── __init__.py
├── base.py
├── cohere.py
├── llamacpp.py
├── mistral.py
├── ollama.py
├── openai.py
├── openrouter.py
└── zure.py

I only have access to Google Gemini, which also follows OpenAI specs but has a different client setup. Is there a way to use Gemini with the current setup? Alternatively, does this library have any planned extensions for Gemini in the future, or would we need to extend the base class in our code to implement Gemini-specific functionality?
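Since Gemini exposes an OpenAI-compatible chat completions endpoint, extending the base class may require little more than pointing the standard OpenAI client at Google's base URL. Below is a minimal sketch, assuming the library's base class contract is a `name` attribute plus a `__call__(messages) -> str` method; the `BaseLLM` stand-in, the default model name, and the endpoint URL are illustrative assumptions, not the library's actual API:

```python
# Sketch only: assumes the library's BaseLLM contract is a `name` attribute
# plus a __call__(messages) -> str method. A minimal stand-in base class is
# defined here so the example is self-contained.
from typing import List, Optional


class BaseLLM:
    """Stand-in for the library's base LLM class (assumption, not the real API)."""

    def __init__(self, name: str):
        self.name = name

    def __call__(self, messages: List[dict]) -> str:
        raise NotImplementedError


class GeminiLLM(BaseLLM):
    """Calls Gemini through its OpenAI-compatible chat completions endpoint."""

    def __init__(
        self,
        name: str = "gemini-1.5-flash",
        api_key: Optional[str] = None,
        base_url: str = "https://generativelanguage.googleapis.com/v1beta/openai/",
    ):
        super().__init__(name=name)
        self._api_key = api_key
        self._base_url = base_url
        self._client = None  # created lazily so the class imports without openai

    def _get_client(self):
        if self._client is None:
            # Requires the openai package; imported at call time, not import time.
            from openai import OpenAI

            self._client = OpenAI(api_key=self._api_key, base_url=self._base_url)
        return self._client

    def __call__(self, messages: List[dict]) -> str:
        response = self._get_client().chat.completions.create(
            model=self.name, messages=messages
        )
        return response.choices[0].message.content
```

Usage would then mirror the other integrations, e.g. `GeminiLLM(api_key="...")([{"role": "user", "content": "hi"}])`. If the real base class signature differs, only the constructor and `__call__` wiring should need adjusting.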
