# M3 Chat Go SDK

The M3 Chat Go SDK is a Go client for interacting with the M3 Chat AI chat platform. It offers an easy-to-use, idiomatic Go interface compatible with the official TypeScript SDK, allowing you to send chat messages, stream responses, and batch multiple requests.
## Table of Contents

- Introduction
- Features
- Installation
- Quick Start
- Usage
- Available Models
- Development
- Contribution
- License
- Contact
## Features

- Send chat requests with configurable models and content
- Support for streaming and non-streaming responses
- Batch multiple chat requests sequentially
- Validate available models
- Easy to use and integrate into Go projects
- Compatible with Go 1.21+
## Installation

```bash
go get github.com/m3-chat/go-sdk
```
## Quick Start

```go
package main

import (
	"log"

	"github.com/m3-chat/go-sdk/client"
	"github.com/m3-chat/go-sdk/types"
)

func main() {
	c := client.NewClient(&types.ClientOptions{
		Stream: true,
	})

	err := c.GetResponse(types.RequestParams{
		Model:   "mistral",
		Content: "Hello, how are you?",
	})
	if err != nil {
		log.Fatal(err)
	}
}
```
## Usage

### Sending a Chat Request

```go
c := client.NewClient(&types.ClientOptions{
	Stream: false, // Set to true to stream responses
})

err := c.GetResponse(types.RequestParams{
	Model:   "mistral",
	Content: "Explain quantum computing in simple terms.",
})
```

- Returns an error if the model is invalid or the request fails.
- Streams output to stdout if `Stream` is enabled.
### Batching Requests

```go
messages := []string{
	"Who won the 2022 World Cup?",
	"What is the capital of France?",
	"Tell me a joke.",
}

results, err := c.BatchRequests(messages, types.BatchOptions{
	Model: "dolphin3",
})
```

Returns a slice of responses, one for each message.
## Available Models

The M3 Chat Go SDK internally validates models against this list:

```go
[]string{
	"llama3:8b",
	"llama2-uncensored",
	"gemma3",
	"gemma",
	"phi3:mini",
	"mistral",
	"gemma:2b",
	"gemma:7b",
	"qwen:7b",
	"qwen2.5-coder",
	"qwen3",
	"deepseek-coder:6.7b",
	"deepseek-v2:16b",
	"dolphin-mistral:7b",
	"dolphin3",
	"starcoder2:7b",
	"magistral",
	"devstral",
}
```
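Requests with a model outside this list fail before any network call is made. The check can be sketched in plain Go as a set lookup; the `isValidModel` helper below is illustrative only, not part of the SDK's public API:

```go
package main

import "fmt"

// supportedModels mirrors the list the SDK validates against.
var supportedModels = map[string]bool{
	"llama3:8b":           true,
	"llama2-uncensored":   true,
	"gemma3":              true,
	"gemma":               true,
	"phi3:mini":           true,
	"mistral":             true,
	"gemma:2b":            true,
	"gemma:7b":            true,
	"qwen:7b":             true,
	"qwen2.5-coder":       true,
	"qwen3":               true,
	"deepseek-coder:6.7b": true,
	"deepseek-v2:16b":     true,
	"dolphin-mistral:7b":  true,
	"dolphin3":            true,
	"starcoder2:7b":       true,
	"magistral":           true,
	"devstral":            true,
}

// isValidModel reports whether name is one of the supported models.
func isValidModel(name string) bool {
	return supportedModels[name]
}

func main() {
	fmt.Println(isValidModel("mistral")) // true
	fmt.Println(isValidModel("gpt-4"))   // false
}
```

A map lookup keeps the check O(1) and makes the zero value (`false`) do the rejection for unknown names.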
## Development

To build or contribute:

```bash
git clone https://github.com/m3-chat/go-sdk.git
cd go-sdk
go build ./...
```
## Contribution

Contributions, issues, and feature requests are welcome! Please open issues or pull requests on the GitHub repository.
## License

This project is licensed under the Apache License 2.0. See the LICENSE file for details.
## Contact

Join the M3 Chat Discussions or open an issue on GitHub for support and questions.
Thank you for using M3 Chat Go SDK — build powerful AI chat applications with ease!