Replies: 1 comment
Hey! Llamafile exposes an OpenAI-compatible API, so this is pretty straightforward. First, point RubyLLM at your llamafile server:

```ruby
# config/initializers/ruby_llm.rb
RubyLLM.configure do |config|
  config.openai_api_key = "not-needed" # llamafile doesn't need a key, but RubyLLM expects one
  config.openai_api_base = "http://localhost:8080/v1"
end
```

Then install the gem and create your agent:

```ruby
# app/agents/application_agent.rb
class ApplicationAgent < RubyLLM::Agents::Base
end
```
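If you want to sanity-check the llamafile endpoint before wiring up any agents, here's a minimal sketch of the OpenAI-style chat payload it expects. `"your-llamafile-model"` is a placeholder for whatever model name your server reports; the actual POST is shown as a comment so the snippet runs without a server:

```ruby
require "json"

# Build an OpenAI-style chat-completion payload for the llamafile server.
# The model name is a placeholder; use whatever your llamafile reports.
def chat_payload(question, model: "your-llamafile-model")
  {
    model: model,
    messages: [
      { role: "system", content: "You are a helpful weather assistant." },
      { role: "user",   content: question }
    ]
  }
end

body = JSON.generate(chat_payload("Tell me what temperature is now in Warsaw"))
puts body

# To actually send it (with the server running):
# Net::HTTP.post(URI("http://localhost:8080/v1/chat/completions"), body,
#                "Content-Type" => "application/json")
```

If the server answers here, RubyLLM's configured `openai_api_base` will work too, since it talks to the same `/v1/chat/completions` route.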
```ruby
# app/agents/weather_agent.rb
class WeatherAgent < ApplicationAgent
  model "your-llamafile-model" # whatever model name llamafile reports
  system "You are a helpful weather assistant. When asked about weather, use the get_weather tool to fetch current conditions, then answer naturally."
  user "{question}"
  tools GetWeather
end
```

For the weather tool, you'd create something like:

```ruby
# app/tools/get_weather.rb
require "net/http"
require "json"

class GetWeather < RubyLLM::Tool
  description "Get current weather for a location"
  param :location, desc: "City name"

  def execute(location:)
    # Call a weather API (e.g. OpenWeatherMap, wttr.in)
    response = Net::HTTP.get(URI("https://wttr.in/#{URI.encode_www_form_component(location)}?format=j1"))
    data = JSON.parse(response)
    current = data["current_condition"][0]
    "#{current["temp_C"]}°C, #{current["weatherDesc"][0]["value"]}"
  end
end
```

Then just call it:

```ruby
WeatherAgent.call(question: "Tell me what temperature is now in Warsaw")
# => agent calls the get_weather tool, gets the data, responds naturally
```

One heads-up: tool calling depends on your llamafile model actually supporting function calling, and smaller models might struggle with it. If yours doesn't support tools, you can skip the tool and just ask directly, or use a two-step approach where one agent fetches the weather via code and another summarizes it. The wiki has more examples if you want to dig deeper. Let me know if you hit any issues!
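The two-step fallback can be sketched like this: condense the wttr.in `format=j1` JSON in plain Ruby, then hand the compact summary to a tool-free agent as context. The field names (`current_condition`, `temp_C`, `weatherDesc`) follow wttr.in's response format; the final `SummarizerAgent` call is a hypothetical name shown as a comment, not RubyLLM's actual API:

```ruby
# Condense wttr.in format=j1 data into a one-line summary the model can use.
# Field names follow wttr.in's JSON response structure.
def weather_summary(data)
  current = data["current_condition"][0]
  "#{current["temp_C"]}°C, #{current["weatherDesc"][0]["value"]}"
end

# A sample of the relevant slice of a format=j1 response:
sample = {
  "current_condition" => [
    { "temp_C" => "7", "weatherDesc" => [{ "value" => "Light rain" }] }
  ]
}
puts weather_summary(sample) # => "7°C, Light rain"

# Step 2 (sketch): feed the summary to a plain agent with no tools, e.g.
# SummarizerAgent.call(question: "It is #{weather_summary(sample)} in Warsaw. Should I take an umbrella?")
```

Since the model only ever sees a short pre-digested string, this works even with small models that can't emit structured tool calls.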
I use https://github.com/mozilla-ai/llamafile, a simple single-file LLM serving a text model on localhost:8080 with a v1 API.
How do I write a simple agent that reads the weather and tells me, e.g.:
'Tell me what temperature is now'
'Will it rain tomorrow?'
'Should I take an umbrella to work on Wednesday?'