Fix #854: Ollama sequential tool calling failing #857
Conversation
@claude review this pull request and do a detailed analysis; fix this if the existing code doesn't have the solution implemented. Make sure it has backward compatibility and that no existing features are removed. After making those changes, review the applied changes again. Use @web to search if you don't know any information, or to find the latest documentation or the latest version. Run the code if you think you need to run it to test it. Start with a minimal code change, if any changes are required.

Claude encountered an error (View job). I'll analyze this and get back to you.
@gemini read the code and provide a review of whether the implementation is correct
📋 Review Summary

This pull request provides a crucial fix for a bug where Ollama models would fail during sequential tool calls due to context loss. The change correctly preserves the full conversation history, ensuring the model can proceed with the sequence as expected.

🔍 General Feedback

The proposed solution is clean, direct, and effectively resolves the identified issue. By concatenating the existing message history with the new tool-related messages, the context is maintained for subsequent model calls. The change is consistently applied to both the synchronous and asynchronous functions, ensuring uniform behavior.

🎯 Specific Feedback

No specific issues identified in this review. The implementation is correct and follows best practices.

Review completed by Gemini CLI
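The history-preservation pattern the review describes can be sketched as follows. This is a minimal, illustrative sketch only: the function and message-field names here are hypothetical and are not the PR's actual identifiers; the real change lives in the PR diff. The core idea is that each tool exchange is appended to the full conversation history rather than replacing it.

```python
# Hypothetical sketch of the history-preserving fix described in the review.
# Names below (run_tool_sequence, execute_tool) are illustrative, not the
# PR's actual identifiers.

def run_tool_sequence(messages, tool_calls, execute_tool):
    """Append each tool call and its result to the full conversation history.

    The bug: rebuilding the message list from only the latest tool exchange
    drops earlier context, so the model loses track mid-sequence. The fix:
    concatenate the new tool-related messages onto the existing history.
    """
    history = list(messages)  # keep everything the model has seen so far
    for call in tool_calls:
        result = execute_tool(call["name"], call.get("arguments", {}))
        # assistant message announcing the tool call
        history.append({"role": "assistant", "content": "", "tool_calls": [call]})
        # tool message carrying the result, preserved alongside prior turns
        history.append({"role": "tool", "content": str(result)})
    return history


# Toy tools standing in for real tool implementations
tools = {"add": lambda a, b: a + b, "double": lambda x: x * 2}

def execute_tool(name, args):
    return tools[name](**args)

start = [{"role": "user", "content": "add 2 and 3, then double it"}]
calls = [
    {"name": "add", "arguments": {"a": 2, "b": 3}},
    {"name": "double", "arguments": {"x": 5}},
]
final = run_tool_sequence(start, calls, execute_tool)
print(len(final))  # → 5: the user turn plus two messages per tool call
```

Because the returned list still begins with the original user turn, a follow-up model call sees the whole sequence rather than only the last tool result, which is exactly the context-loss failure the PR fixes.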
@gemini-cli review |
📋 Review Summary

This pull request addresses a bug in the sequential tool calling feature for Ollama models, where the conversation history was not being correctly preserved. The fix ensures the full message history is passed to the model in subsequent calls, which effectively resolves the context loss issue.

🔍 General Feedback

The proposed change is well-implemented, clear, and directly solves the reported problem. By concatenating the existing history with the new messages, the fix is both simple and effective. It's good to see that the solution was consistently applied to both the synchronous and asynchronous functions.

🎯 Specific Feedback

No specific issues identified in this review.

Review completed by Gemini CLI
@gemini I get the above error after implementing the changes you made
Fixes #854
Summary of changes: