
Conversation

@mnort9 (Contributor) commented Dec 10, 2025

What this does

Adds a with_messages method to RubyLLM::Chat that allows filtering messages before they're sent to the provider. This enables fine-grained control over which messages are included in API requests, which is particularly useful for managing context windows, implementing sliding window strategies, or selectively including only recent messages.

Ref discussion: #495

I considered creating a new chat and associating the two, but I didn't want to mix the application's chat/message data model with something needed only for the LLM call. This approach keeps the message-scoping logic separate from the application's data model.

When called with a block, the block receives all messages and returns the filtered subset that will be sent to the provider. When called without a block, it clears any previously set scope and returns to using all messages.
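
Conceptually the scoping works along these lines (a simplified sketch, not the exact code in this diff; @message_scope and messages_for_provider are illustrative names):

module RubyLLM
  class Chat
    # Simplified: when a block is given, remember it; when not, clear any scope.
    # Returning self keeps the call chainable.
    def with_messages(&block)
      @message_scope = block
      self
    end

    private

    # Illustrative helper: apply the scope (if any) to the full message list
    # right before building the provider payload.
    def messages_for_provider
      return messages unless @message_scope

      @message_scope.call(messages)
    end
  end
end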

Usage example:

# Scope to only the most recent 3 messages
chat = chat.with_messages { |msgs| msgs.last(3) }
chat.complete # Only sends the most recent 3 messages to the provider

# Clear the scope to use all messages again
chat = chat.with_messages
chat.complete # Uses all messages again
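
A sliding-window or token-budget strategy is just a different block. For example (token_estimate is a hypothetical helper you'd provide, not something RubyLLM ships):

# Keep only as many trailing messages as fit an approximate token budget.
chat = chat.with_messages do |msgs|
  budget = 8_000
  kept = []
  msgs.reverse_each do |msg|
    cost = token_estimate(msg) # hypothetical helper
    break if cost > budget

    budget -= cost
    kept.unshift(msg)
  end
  kept
end
chat.complete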

Type of change

  • Bug fix
  • New feature
  • Breaking change
  • Documentation
  • Performance improvement

Scope check

  • I read the Contributing Guide
  • This aligns with RubyLLM's focus on LLM communication
  • This isn't application-specific logic that belongs in user code
  • This benefits most users, not just my specific use case

Quality check

  • I ran overcommit --install and all hooks pass
  • I tested my changes thoroughly
    • For provider changes: Re-recorded VCR cassettes with bundle exec rake vcr:record[provider_name]
    • All tests pass: bundle exec rspec
  • I updated documentation if needed
  • I didn't modify auto-generated files manually (models.json, aliases.json)

API changes

  • Breaking change
  • New public methods/classes
  • Changed method signatures
  • No API changes

@smerickson

This is super cool and would be very helpful for doing more advanced context engineering. You could even imagine packaging up some of the more common patterns (blocks) as a separate RubyLLM library.

Reminds me a bit of middleware in LangChain: https://docs.langchain.com/oss/python/langchain/middleware/built-in

I'd been starting to think about what a "middleware"-style architecture, similar to Rack middleware, would look like for RubyLLM. That could allow more hooks and advanced functionality to be built on top of RubyLLM without those things needing to live in "core".
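
For what it's worth, a lot of that could probably be prototyped on top of with_messages already: each "middleware" is just a callable from messages to messages, and a stack folds down into a single block. Rough sketch (all names made up, and the role check assumes messages expose a role; nothing here is a proposed RubyLLM API):

# Each "middleware" responds to #call(messages) and returns messages.
last_n        = ->(msgs) { msgs.last(10) }
drop_tool_raw = ->(msgs) { msgs.reject { |m| m.role == :tool } } # assumes a #role accessor

# Fold the stack into one filter, applied left to right, and hand it to with_messages.
stack  = [drop_tool_raw, last_n]
filter = ->(msgs) { stack.reduce(msgs) { |acc, mw| mw.call(acc) } }

chat.with_messages { |msgs| filter.call(msgs) }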
