
Fix streaming tool call nil arguments#587

Open
afurm wants to merge 1 commit into crmne:main from afurm:fix/streaming-tool-call-nil-args

Conversation

@afurm
Contributor

@afurm afurm commented Jan 28, 2026

What this does

Fixes a streaming crash when tool-call deltas omit function.arguments by normalizing nil/empty fragments during accumulation, and adds a regression spec to cover the case.
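The fix described above can be sketched as a small accumulator that treats a missing `function.arguments` fragment as an empty string before appending. This is an illustrative sketch, not RubyLLM's actual internals: the class and method names (`ToolCallAccumulator`, `add_delta`) are hypothetical, and only the nil-normalization idea comes from the PR description.

```ruby
# Hypothetical sketch of the fix: some providers emit tool-call deltas
# whose function.arguments is nil or absent, so normalize such fragments
# to "" during accumulation instead of crashing on nil concatenation.
class ToolCallAccumulator
  def initialize
    @name = nil
    @arguments = +'' # mutable buffer for streamed JSON argument fragments
  end

  # delta is a Hash like { "function" => { "name" => ..., "arguments" => ... } }
  def add_delta(delta)
    function = delta['function'] || {}
    @name ||= function['name']
    fragment = function['arguments']
    # Skip nil/empty fragments; appending nil would raise TypeError.
    @arguments << fragment unless fragment.nil? || fragment.empty?
    self
  end

  def to_h
    { name: @name, arguments: @arguments }
  end
end
```

Under these assumptions, a delta stream with a nil-arguments fragment accumulates cleanly:

```ruby
acc = ToolCallAccumulator.new
acc.add_delta('function' => { 'name' => 'get_weather', 'arguments' => nil })
acc.add_delta('function' => { 'arguments' => '{"city":' })
acc.add_delta('function' => { 'arguments' => '"Kyiv"}' })
acc.to_h # => { name: "get_weather", arguments: '{"city":"Kyiv"}' }
```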

Type of change

  • Bug fix
  • New feature
  • Breaking change
  • Documentation
  • Performance improvement

Scope check

  • I read the Contributing Guide
  • This aligns with RubyLLM's focus on LLM communication
  • This isn't application-specific logic that belongs in user code
  • This benefits most users, not just my specific use case

Required for new features

  • I opened an issue before writing code and received maintainer approval
  • Linked issue: #___

PRs for new features or enhancements without a prior approved issue will be closed.

Quality check

  • I ran overcommit --install and all hooks pass
  • I tested my changes thoroughly
    • For provider changes: Re-recorded VCR cassettes with bundle exec rake vcr:record[provider_name]
    • All tests pass: bundle exec rspec
  • I updated documentation if needed
  • I didn't modify auto-generated files manually (models.json, aliases.json)

AI-generated code

  • I used AI tools to help write this code
  • I have reviewed and understand all generated code (required if above is checked)

API changes

  • Breaking change
  • New public methods/classes
  • Changed method signatures
  • No API changes

@aka-nez

aka-nez commented Feb 4, 2026

Faced exactly the same issue with the OpenRouter provider.

@smerickson

I had been running into issues using Haiku 4.5 on AWS Bedrock while streaming and using tool calls. The way it manifested was the model calling a tool with incorrect parameters. I always had a sneaking suspicion that it wasn't model hallucination, because I had never experienced that when I was using the AWS SDK directly instead of RubyLLM. I just applied this patch to my system, and my streaming issues with Haiku seem to have been resolved. This would be great to get merged into core.

@afurm
Contributor Author

afurm commented Feb 4, 2026

Thanks for confirming and for testing this.
