This repository was archived by the owner on Oct 23, 2025. It is now read-only.
Conversation
Implemented an Ollama provider as an alternative to OpenRouter for local LLM usage. Added a complete Ollama class in src/agentic_ai/llm_provider/ollama.py that mirrors OpenRouter's interface but talks to the local Ollama API (http://localhost:11434/v1). Updated all examples to use the Ollama backend instead of OpenRouter. Users can now switch between providers by changing the llm_backend parameter from "OpenRouter" to "ollama".

**Limitations**
- Multimodal models on Ollama can handle images but not PDFs.
- Model choice determines capability: some models aren't multimodal, so they can only process text, and some models don't support tool calling. I use qwen3:8b, a light reasoning model, for most examples, and gemma3n:latest for the file-upload example with the image.
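The backend switch described above can be sketched as follows. This is a minimal illustration, not the repository's actual code: the Ollama class name, the llm_backend parameter, and the base URL come from the PR description, while the factory function and build_request helper are assumptions for the example.

```python
# Sketch of an Ollama provider selected via an llm_backend parameter.
# Only the class name, parameter name, and base URL come from the PR;
# everything else here is illustrative.

OLLAMA_BASE_URL = "http://localhost:11434/v1"  # local Ollama OpenAI-compatible endpoint


class Ollama:
    """Minimal provider sketch targeting the local Ollama API."""

    def __init__(self, model: str = "qwen3:8b"):
        self.base_url = OLLAMA_BASE_URL
        self.model = model

    def build_request(self, messages):
        # Ollama exposes an OpenAI-compatible /chat/completions endpoint,
        # so the payload follows the OpenAI chat format.
        return {
            "url": f"{self.base_url}/chat/completions",
            "json": {"model": self.model, "messages": messages},
        }


def get_provider(llm_backend: str, **kwargs):
    # Switching providers is just a matter of changing llm_backend.
    backends = {"ollama": Ollama}
    return backends[llm_backend.lower()](**kwargs)


provider = get_provider("ollama", model="qwen3:8b")
req = provider.build_request([{"role": "user", "content": "hi"}])
```

Because Ollama serves an OpenAI-compatible API, no request-shape changes are needed when swapping it in for a hosted provider.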
…providers
- Created OpenAIProvider base class for OpenAI SDK compatibility
- Refactored Ollama and OpenRouter to inherit from OpenAIProvider
- Eliminated ~95% of the code duplication between providers
- Maintained full backward compatibility with existing examples
- Modified all examples to accept --backend and --model CLI arguments
- Updated tests/run_examples.py to test the ollama and openrouter backends
- Added support for testing specific backends via CLI flags
- Examples now support the ollama, openrouter, and openai backends

Testing instructions:
- Run all tests: python3 -m tests.run_examples
- Test a specific backend: python3 -m tests.run_examples --backend ollama --model qwen3:8b
- Test openrouter: python3 -m tests.run_examples --backend openrouter --model google/gemini-2.5-pro
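A minimal sketch of the --backend/--model argument handling the examples now share. The flag names and backend choices come from the testing instructions above; the defaults and the parse_args helper are assumptions.

```python
# Hedged sketch of the CLI flags the examples accept. Flag names and
# backend choices follow the testing instructions; defaults are assumed.
import argparse


def parse_args(argv=None):
    parser = argparse.ArgumentParser(
        description="Run an example against a chosen LLM backend."
    )
    parser.add_argument(
        "--backend",
        choices=["ollama", "openrouter", "openai"],
        default="ollama",
        help="LLM serving backend to use",
    )
    parser.add_argument(
        "--model",
        default="qwen3:8b",
        help="model identifier understood by the chosen backend",
    )
    return parser.parse_args(argv)


# Mirrors: python3 -m tests.run_examples --backend openrouter --model google/gemini-2.5-pro
args = parse_args(["--backend", "openrouter", "--model", "google/gemini-2.5-pro"])
```

Restricting --backend with `choices` makes argparse reject unsupported backends with a usage error instead of failing later at request time.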
This PR proposes a unified interface for the different LLM serving frameworks we plan to support. It adds tests, an OpenRouter connector, and high- and low-level examples.