feat(ollama): added streaming & tool call support for ollama, updated docs #884
waleedlatif1 merged 1 commit into staging
Conversation
Greptile Summary
This PR significantly enhances the Ollama provider by adding streaming responses and tool calling capabilities, bringing it to feature parity with other providers in the system. The changes transform Ollama from a basic local model provider into a fully-featured AI integration that supports complex workflows.
The core enhancement is in the Ollama provider (apps/sim/providers/ollama/index.ts), which now implements streaming through a custom createReadableStreamFromOllamaStream helper function and includes a complete tool execution loop with proper timing tracking. The provider integrates with shared utility functions to ensure consistent behavior across all providers.
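The streaming helper named above can be sketched as a wrapper that turns Ollama's async-iterable chunk stream into a web `ReadableStream`. This is an illustrative reconstruction, not the PR's actual implementation; the chunk shape (`message.content`) mirrors the ollama JS client, and the `onComplete` callback is an assumption based on the signature quoted in the review below.

```typescript
// Hypothetical sketch of the streaming helper described in the PR.
// Chunk shape is an assumption modeled on the ollama JS client.
interface OllamaChunk {
  message: { content: string }
  done?: boolean
}

function createReadableStreamFromOllamaStream(
  ollamaStream: AsyncIterable<OllamaChunk>,
  onComplete?: (content: string) => void
): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder()
  let content = ''
  return new ReadableStream<Uint8Array>({
    async start(controller) {
      try {
        // Forward each chunk's text to the consumer while accumulating the full reply
        for await (const chunk of ollamaStream) {
          const text = chunk.message?.content ?? ''
          content += text
          controller.enqueue(encoder.encode(text))
        }
        // Hand the complete response to the caller (e.g. for usage/cost tracking)
        onComplete?.(content)
        controller.close()
      } catch (err) {
        controller.error(err)
      }
    },
  })
}
```

Wrapping the iterator in `start` keeps backpressure handling with the `ReadableStream` machinery while still letting the provider observe the full assembled response.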
To support dynamic model management, the Ollama store (apps/sim/stores/ollama/store.ts) has been upgraded with client-side model fetching capabilities, including loading states and auto-initialization. A new API endpoint (apps/sim/app/api/providers/ollama/models/route.ts) provides a bridge between the frontend and local Ollama server for dynamic model discovery.
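The model-discovery bridge can be sketched as a small helper that the route handler would call. This is an assumption-laden sketch, not the PR's code: `fetchOllamaModels` and the default URL are illustrative, though `/api/tags` is Ollama's real endpoint for listing locally pulled models.

```typescript
// Hypothetical helper behind apps/sim/app/api/providers/ollama/models/route.ts.
// The route would return something like NextResponse.json({ models: await fetchOllamaModels() }).
const DEFAULT_OLLAMA_URL = 'http://localhost:11434' // typically overridden via an env var

async function fetchOllamaModels(baseUrl: string = DEFAULT_OLLAMA_URL): Promise<string[]> {
  try {
    // Ollama's /api/tags endpoint lists models available on the local server
    const res = await fetch(`${baseUrl}/api/tags`)
    if (!res.ok) return []
    const data = (await res.json()) as { models?: Array<{ name: string }> }
    return (data.models ?? []).map((m) => m.name)
  } catch {
    // Local Ollama server unreachable: degrade to an empty model list
    return []
  }
}
```

Returning an empty list instead of throwing keeps the UI usable when no Ollama server is running, which matches the loading-state behavior the store upgrade describes.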
The UI system has been enhanced to support dynamic conditions through function-based condition evaluation in both the block type definitions and resolver. This enables sub-blocks to show/hide based on runtime state, specifically allowing the API key field to be hidden for Ollama models since they run locally and don't require authentication.
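The function-based condition mechanism can be illustrated with a minimal sketch. Field and type names here (`SubBlockConfig`, `hidden`, `isHidden`) are assumptions for illustration; the real definitions live in the block type files and resolver.

```typescript
// Hypothetical shape of a sub-block whose visibility can be a static flag
// or a function evaluated against runtime state.
type ConditionFn = (state: { provider: string; model: string }) => boolean

interface SubBlockConfig {
  id: string
  hidden?: boolean | ConditionFn
}

// The resolver evaluates function conditions at render time
function isHidden(block: SubBlockConfig, state: { provider: string; model: string }): boolean {
  return typeof block.hidden === 'function' ? block.hidden(state) : (block.hidden ?? false)
}

// Example: hide the API key field when the selected provider is Ollama,
// since local models need no authentication
const apiKeyField: SubBlockConfig = {
  id: 'apiKey',
  hidden: (state) => state.provider === 'ollama',
}
```

Accepting `boolean | ConditionFn` keeps existing static configs working while letting new blocks react to runtime state.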
Documentation and deployment have been streamlined with updated README instructions and a comprehensive Docker Compose configuration that includes the full application stack alongside Ollama. The changes also include environment variable configuration and removal of redundant shell scripts in favor of standardized Docker Compose workflows.
Confidence score: 1/5
- This PR contains a critical runtime error that will cause the application to crash when using Ollama models
- Score severely lowered due to an import-before-use violation in `apps/sim/blocks/blocks/agent.ts`, where the `getCurrentOllamaModels` function uses `useOllamaStore` before it is imported
- Pay immediate attention to `apps/sim/blocks/blocks/agent.ts`: the function definition on lines 16-19 must be moved after the import on line 21
14 files reviewed, 7 comments
```typescript
  ollamaStream: any,
  onComplete?: (content: string, usage?: any) => void
```

**style:** consider using more specific types instead of `any` for better type safety

Context used: Avoid using type assertions to `any` in TypeScript. Instead, ensure proper type definitions are used to maintain type safety.
```typescript
// Helper function to check for forced tool usage in responses
const checkForForcedToolUsage = (
  response: any,
  toolChoice: string | { type: string; function?: { name: string }; name?: string; any?: any }
```

**style:** the union type parameter is quite complex; consider extracting it to a dedicated type definition
```typescript
      forcedTools,
      usedForcedTools
    )
    hasUsedForcedTool = result.hasUsedForcedTool
```

**logic:** the variable `hasUsedForcedTool` is accessed before its declaration; this will cause a runtime error
**Suggested change:** declare `hasUsedForcedTool` before the helper that assigns to it:

```typescript
// Track if a forced tool has been used
let hasUsedForcedTool = false

// Helper function to check for forced tool usage in responses
const checkForForcedToolUsage = (
  response: any,
  toolChoice: string | { type: string; function?: { name: string }; name?: string; any?: any }
) => {
  if (typeof toolChoice === 'object' && response.choices[0]?.message?.tool_calls) {
    const toolCallsResponse = response.choices[0].message.tool_calls
    const result = trackForcedToolUsage(
      toolCallsResponse,
      toolChoice,
      logger,
      'ollama',
      forcedTools,
      usedForcedTools
    )
    hasUsedForcedTool = result.hasUsedForcedTool
    usedForcedTools = result.usedForcedTools
  }
}
```
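The ordering bug flagged above can be reproduced in a few lines: a `let` binding stays in its temporal dead zone until the declaration statement runs, so a helper that assigns to it throws if invoked first. The function and variable names below are illustrative, not the PR's code.

```typescript
// Minimal reproduction of the flagged bug: the helper assigns to a `let`
// binding before its declaration has been evaluated.
function brokenOrdering(): boolean {
  const markUsed = () => {
    hasUsedForcedTool = true // throws: binding is still in its temporal dead zone
  }
  markUsed() // ReferenceError: Cannot access 'hasUsedForcedTool' before initialization
  let hasUsedForcedTool = false
  return hasUsedForcedTool
}

let threw = false
try {
  brokenOrdering()
} catch (err) {
  threw = err instanceof ReferenceError
}
```

Moving the `let` declaration above any call site, as the suggestion does, removes the dead-zone window entirely.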
```typescript
// Create a new error with timing information
const enhancedError = new Error(error instanceof Error ? error.message : String(error))
// @ts-ignore - Adding timing property to the error
```

**style:** using `@ts-ignore` bypasses type checking; consider defining a proper error interface with a timing property

Context used: Avoid using type assertions to `any` in TypeScript. Instead, ensure proper type definitions are used to maintain type safety.
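One way to address this comment without `@ts-ignore` is a dedicated error class that carries the timing data in a typed field. The `TimingInfo` shape and class name below are assumptions for illustration, not the project's actual types.

```typescript
// Hypothetical typed alternative to attaching an untyped `timing` property.
interface TimingInfo {
  startTime: string
  endTime: string
  duration: number
}

class ProviderError extends Error {
  timing?: TimingInfo

  constructor(cause: unknown, timing?: TimingInfo) {
    // Preserve the original message whether the cause was an Error or not
    super(cause instanceof Error ? cause.message : String(cause))
    this.name = 'ProviderError'
    this.timing = timing
  }
}
```

Callers can then read `err.timing` with full type checking instead of suppressing the compiler.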
```typescript
// Get current Ollama models dynamically
const getCurrentOllamaModels = () => {
  return useOllamaStore.getState().models
}
```

**logic:** this function references `useOllamaStore` before it is imported on line 21, which will cause a ReferenceError at runtime
```typescript
// Get current Ollama models dynamically
const getCurrentOllamaModels = () => {
  return useOllamaStore.getState().models
}

import { useOllamaStore } from '@/stores/ollama/store'
```

**syntax:** move the import statement before the function definition to fix the reference error

**Suggested change:**

```typescript
import { useOllamaStore } from '@/stores/ollama/store'

// Get current Ollama models dynamically
const getCurrentOllamaModels = () => {
  return useOllamaStore.getState().models
}
```
```typescript
const hasUserKey = !!userProvidedKey

// Ollama models don't require API keys - they run locally
const isOllamaModel = provider === 'ollama' || useOllamaStore.getState().models.includes(model)
```

**logic:** using `useOllamaStore.getState().models.includes(model)` in server-side code could cause issues if the store is not properly initialized on the server; consider adding a null check or a server-side guard
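A guard along the lines this comment suggests could inject the model-list lookup, so server-side callers can supply a safe fallback when the client-side store has not been initialized. The function and parameter names below are illustrative, not the project's code.

```typescript
// Hypothetical guarded version of the flagged check.
function isOllamaModel(
  provider: string,
  model: string,
  getModels: () => string[] | undefined
): boolean {
  // The provider match needs no store access at all
  if (provider === 'ollama') return true
  // The store's model list is populated client-side; treat a missing list as empty
  const models = getModels() ?? []
  return models.includes(model)
}
```

On the client this would be called with `() => useOllamaStore.getState().models`; on the server, with a stub returning `undefined`, so an uninitialized store can never throw.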
Summary
Added streaming & tool call support for ollama, updated docs
Type of Change
Testing
Tested manually
Checklist
Screenshots/Videos