This is a CLI application written in Go that uses a local LLM (phi3 via Ollama) to sum two numbers. It demonstrates how to integrate Go with local LLMs using LangChain for Go (langchaingo).
- Go: Ensure you have Go installed (version 1.25.3 or later recommended).
- Ollama: You need to have Ollama installed and running.
- Phi3 Model: GitHub Codespaces have limited resources, and we found this model to be a good compromise between accuracy and resource needs. Pull the required model:

  ```bash
  ollama pull phi3
  ```
- Clone the repository (if applicable) or navigate to the project directory.
- Install dependencies:
  ```bash
  go mod tidy
  ```
- Build the application:
  ```bash
  go build -o ai-summator main.go
  ```
Run the built binary with two numeric arguments:

```bash
./ai-summator 5 3
```

Example output:

```
Result: 8.000000
```
Floating point numbers are supported:

```bash
./ai-summator 1.5 2.7
```

The project includes both unit tests and integration tests.
To run all tests:

```bash
go test -v ./...
```

Note: The integration tests require Ollama to be running and the `phi3` model to be available. If Ollama is not reachable, the integration tests will fail.
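The unit tests work without Ollama by mocking the LLM. A stdlib-only sketch of that pattern (the interface and names here are illustrative, not the project's actual API): the summator depends on a narrow interface, so tests can substitute a canned reply.

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// llmClient is the narrow interface the summator needs; a real
// implementation would wrap the langchaingo Ollama client.
type llmClient interface {
	Generate(prompt string) (string, error)
}

// mockLLM returns a fixed reply, so tests never contact Ollama.
type mockLLM struct{ reply string }

func (m mockLLM) Generate(prompt string) (string, error) {
	return m.reply, nil
}

// sum asks the client for the total and parses the reply as a float.
func sum(c llmClient, a, b float64) (float64, error) {
	prompt := fmt.Sprintf("What is %v + %v? Reply with only the number.", a, b)
	reply, err := c.Generate(prompt)
	if err != nil {
		return 0, err
	}
	return strconv.ParseFloat(strings.TrimSpace(reply), 64)
}

func main() {
	got, err := sum(mockLLM{reply: "8"}, 5, 3)
	fmt.Println(got, err) // 8 <nil>
}
```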
- `main.go`: Entry point for the CLI.
- `summator/`: Contains the core logic and tests.
  - `summator.go`: Implementation of the summator using `langchaingo`.
  - `summator_test.go`: Unit tests with a mocked LLM.
  - `integration_test.go`: Integration tests against a real Ollama instance.
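One detail any LLM-backed summator has to handle is that the model may wrap the answer in prose ("The answer is 8."). A hedged sketch of extracting the number from a free-text reply, using the standard `regexp` package (not necessarily how `summator.go` does it):

```go
package main

import (
	"fmt"
	"regexp"
	"strconv"
)

// numberRe matches an integer or decimal, optionally negative.
var numberRe = regexp.MustCompile(`-?\d+(\.\d+)?`)

// extractNumber returns the first number found in the reply. Because
// it takes the *first* match, the prompt should ask the model to
// respond with only the number.
func extractNumber(reply string) (float64, error) {
	m := numberRe.FindString(reply)
	if m == "" {
		return 0, fmt.Errorf("no number in reply: %q", reply)
	}
	return strconv.ParseFloat(m, 64)
}

func main() {
	v, err := extractNumber("The answer is 8.000000")
	fmt.Println(v, err) // 8 <nil>
}
```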
This project includes a DevContainer configuration. You can open this project in GitHub Codespaces or VS Code with the Dev Containers extension.
The DevContainer is configured to:
- Install Go.
- Install Ollama.
- Automatically start the Ollama server.
- Pull the `phi3` model during the creation phase.
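A `devcontainer.json` along these lines could produce that behavior. This is an illustrative sketch, not the repository's actual configuration; the feature reference, install script URL, and lifecycle commands are assumptions:

```jsonc
{
  "name": "ai-summator",
  // Install the Go toolchain via the official Dev Container feature.
  "features": {
    "ghcr.io/devcontainers/features/go:1": {}
  },
  // On creation: install Ollama, start it briefly, and pull phi3.
  "postCreateCommand": "curl -fsSL https://ollama.com/install.sh | sh && (ollama serve &) && sleep 5 && ollama pull phi3",
  // On every container start: run the Ollama server in the background.
  "postStartCommand": "nohup ollama serve > /tmp/ollama.log 2>&1 &"
}
```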
Note: Running LLMs in a cloud environment (like standard Codespaces) might be slow due to lack of GPU acceleration.