Closed
Description
Hi! I've built an HTTP handler adapter that wraps gollem AI agents as Echo-compatible handlers, making it easy to serve LLM-powered agents behind a standard HTTP API.
Package: github.com/fugue-labs/gollem/contrib/echohandler
What it does
- Wraps a gollem agent as an `echo.HandlerFunc`
- Accepts JSON requests with a `prompt` field and returns the agent's response
- Supports SSE streaming for streamed agent responses
- Handles tool execution, multi-step reasoning, and structured output transparently
Example
```go
package main

import (
	"github.com/labstack/echo/v4"

	"github.com/fugue-labs/gollem/contrib/echohandler"
	"github.com/fugue-labs/gollem/core"
	"github.com/fugue-labs/gollem/provider/openai"
)

func main() {
	model := openai.New()
	agent := core.NewAgent[string](model)

	e := echo.New()
	e.POST("/chat", echohandler.Handler(&echohandler.AgentWrapper{Agent: agent}))
	e.Logger.Fatal(e.Start(":8080"))
}
```

Just wanted to share in case it's useful to the community. Happy to add it to any ecosystem listing if Echo maintains one.