Closed as not planned
Labels
bug: Something isn't working
Description
Basic checks
- I searched existing issues - this hasn't been reported
- I can reproduce this consistently
- This is a RubyLLM bug, not my application code
What's broken?
require 'ruby_llm'
ruby_llm-1.3.1/lib/ruby_llm/provider.rb:24: no anonymous block parameter (SyntaxError)
from <internal:/usr/lib/ruby/vendor_ruby/rubygems/core_ext/kernel_require.rb>:85:in `require'
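A `no anonymous block parameter` SyntaxError is typically raised when `&` is forwarded in a context that has not declared an anonymous block parameter — for example, when a gem uses Ruby 3.1+ block-forwarding syntax on an older interpreter. A minimal illustration with hypothetical methods (not taken from ruby_llm):

```ruby
# Hypothetical methods illustrating block forwarding; not from the gem's source.
def inner
  yield 2
end

# Portable spelling: a named block parameter, accepted on all supported Rubies.
# The anonymous form `def wrap(&) ... inner(&) ... end` (Ruby 3.1+) is the kind
# of syntax that produces "no anonymous block parameter" errors on older rubies.
def wrap(&blk)
  inner(&blk)
end

puts wrap { |x| x * 10 }  # prints 20
```

If the system Ruby is older than what the gem targets, upgrading Ruby (the reporter's later environment shows 3.3.8) avoids this class of error.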
I run local models many ways: vLLM, lmstudio.ai, llama.cpp, and my favorite, llamafile (https://github.com/mozilla-ai/llamafile).
They all expose the simple OpenAI 'v1' HTTP interface.
<internal:/usr/lib/ruby/vendor_ruby/rubygems/core_ext/kernel_require.rb>:85:in `require': cannot load such file -- ruby_llm (LoadError)
Did you mean? rubygems
from <internal:/usr/lib/ruby/vendor_ruby/rubygems/core_ext/kernel_require.rb>:85:in `require'
from (irb):1:in `<main>'
from /usr/lib/ruby/gems/3.1.0/gems/irb-1.4.1/exe/irb:11:in `<top (required)>'
from /usr/bin/irb:25:in `load'
from /usr/bin/irb:25:in `<main>'
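Separately, the `cannot load such file -- ruby_llm (LoadError)` above means the interpreter running irb has no ruby_llm gem installed at all. A sketch of the usual fix, assuming `gem` points at the same Ruby that raised the error:

```shell
# Install the gem for the interpreter that raised the LoadError
gem install ruby_llm

# Confirm it loads; RubyLLM::VERSION is the gem's version constant
ruby -e "require 'ruby_llm'; puts RubyLLM::VERSION"
```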
$ irb
001> require 'ruby_llm'
002* RubyLLM.configure do |config|
003* config.ollama_api_base = 'http://localhost:8111/v1'
004> end
=> "http://localhost:8111/v1"
005> chat = RubyLLM.chat
006> chat.ask "What's the best way to learn Ruby?"
gems/ruby_llm-1.12.1/lib/ruby_llm/provider.rb:242:in `ensure_configured!': Missing configuration for OpenAI: openai_api_key (RubyLLM::ConfigurationError)
from /gems/gems/ruby_llm-1.12.1/lib/ruby_llm/provider.rb:12:in `initialize'
from /gems/gems/ruby_llm-1.12.1/lib/ruby_llm/models.rb:133:in `new'
from /gems/gems/ruby_llm-1.12.1/lib/ruby_llm/models.rb:133:in `resolve'
from /gems/gems/ruby_llm-1.12.1/lib/ruby_llm/chat.rb:66:in `with_model'
from /gems/gems/ruby_llm-1.12.1/lib/ruby_llm/chat.rb:18:in `initialize'
from /gems/gems/ruby_llm-1.12.1/lib/ruby_llm.rb:51:in `new'
from /gems/gems/ruby_llm-1.12.1/lib/ruby_llm.rb:51:in `chat'
from (irb):5:in `<main>'
from <internal:kernel>:187:in `loop'
from /usr/lib/ruby/gems/3.3.0/gems/irb-1.13.1/exe/irb:9:in `<top (required)>'
from /usr/bin/irb:25:in `load'
from /usr/bin/irb:25:in `<main>'
How to reproduce
Run a local model on 127.0.0.1:8111, then run this script:
#!/usr/bin/env ruby
# encoding: UTF-8
require 'ruby_llm'
RubyLLM.configure do |config|
config.ollama_api_base = 'http://localhost:8090/v1'
end
chat = RubyLLM.chat
chat.ask "What's the best way to learn Ruby?"
Expected behavior
Local models should be configurable correctly; better still, add a NEW, dedicated way to configure local models.
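Until a dedicated local-model path exists, recent RubyLLM releases document a way to reach an OpenAI-compatible local server through the OpenAI provider. A sketch, assuming the gem's documented `openai_api_base` and `assume_model_exists` options (names per the RubyLLM docs; the model name is a placeholder for whatever the local server serves):

```ruby
require 'ruby_llm'

RubyLLM.configure do |config|
  # Point the OpenAI provider at the local server's /v1 endpoint.
  config.openai_api_base = 'http://localhost:8111/v1'
  # Local servers usually ignore the key, but the provider requires one.
  config.openai_api_key = 'dummy-key'
end

# Name the local model explicitly and skip the model-registry check,
# since local model names are not in RubyLLM's known-model list.
chat = RubyLLM.chat(
  model: 'llama-3.2-1b',   # placeholder: use the model your server exposes
  provider: :openai,
  assume_model_exists: true
)
chat.ask "What's the best way to learn Ruby?"
```

Setting only `config.ollama_api_base`, as in the transcript above, does not help here: `RubyLLM.chat` with no arguments resolves its default model through the OpenAI provider, which is exactly why it fails with `Missing configuration for OpenAI: openai_api_key`.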
What actually happened
The script does not run; it fails with the errors shown above.
Environment
Ubuntu
ruby 3.3.8 (2025-04-09 revision b200bad6cd) [x86_64-linux-gnu]
Metadata
Assignees
Labels
bug: Something isn't working