This package provides a partial Ollama client with a rich feature set for chat requests against Ollama. It was created to experiment with the new tool-calling functionality in Llama.cpp and Ollama.
See example/orderbot for a full example of how to bind a Go function and provide it as a tool. A simpler example of tool use can be found in example/tick.
First, add the client as a dependency of your project:

    go get github.com/swdunlop/ollama-client
Then, in many circumstances, you can just use the ollama.Chat function:
    import (
        "context"

        ollama "github.com/swdunlop/ollama-client"
        "github.com/swdunlop/ollama-client/chat"
    )

    ret, _ := ollama.Chat(
        context.TODO(),
        chat.Model(`llama3.1:latest`),
        chat.Message(`user`, `what is the airspeed of an unladen swallow?`),
    )
This connects to the Ollama instance running locally and performs the request. See the example directory for more involved uses of tools and other features.