A composable AI agent framework for Go that makes it easy to build production-ready AI applications.
Developed by Calque AI
Building AI apps in Go means wrestling with:
- Provider lock-in - Switching between OpenAI, Gemini, or local models requires rewriting code
- Conversation state - Managing chat history and context windows across requests
- Tool calling - Connecting AI to your Go functions with proper error handling
- Structured outputs - Getting reliable JSON responses that match your types
- RAG pipelines - Coordinating document retrieval, embedding, and generation
Go-Calque solves these with a simple, composable middleware pattern that feels native to Go.
```bash
go get github.com/calque-ai/go-calque
```

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/calque-ai/go-calque/pkg/calque"
	"github.com/calque-ai/go-calque/pkg/middleware/ai"
	"github.com/calque-ai/go-calque/pkg/middleware/ai/ollama"
)

func main() {
	client, err := ollama.New("llama3.2:3b")
	if err != nil {
		log.Fatal(err)
	}

	flow := calque.NewFlow().Use(ai.Agent(client))

	var result string
	err = flow.Run(context.Background(), "What's the capital of France?", &result)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(result)
}
```

Three lines to set up, one line to run.
Conversation memory:

```go
convMem := memory.NewConversation()

flow := calque.NewFlow().
	Use(convMem.Input(userID)).
	Use(ai.Agent(client)).
	Use(convMem.Output(userID))
```

Tool calling:

```go
calculator := tools.Simple("calc", "Math", calcFn)
weather := tools.Simple("weather", "Weather", weatherFn)

flow := calque.NewFlow().
	Use(ai.Agent(client, ai.WithTools(calculator, weather)))
```

Structured outputs:

```go
flow := calque.NewFlow().
	Use(ai.Agent(client, ai.WithSchema(&MyType{})))

var result MyType
flow.Run(ctx, "Analyze this", convert.FromJSONSchema(&result))
```

RAG pipelines:

```go
flow := calque.NewFlow().
	Use(retrieval.VectorSearch(store, opts)).
	Use(prompt.Template(ragTemplate)).
	Use(ai.Agent(client))
```

📖 See Getting Started Guide →
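The tool-calling snippet passes plain Go functions like `calcFn` without showing one. Here is a minimal sketch of what such a function might look like; the name and the `func(string) string` shape are assumptions for illustration, not a signature the library requires:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// calcFn is a hypothetical tool function: it takes the model's
// argument string (e.g. "2 + 3") and returns the result as text.
func calcFn(input string) string {
	parts := strings.SplitN(input, "+", 2)
	if len(parts) != 2 {
		return "error: expected a+b"
	}
	a, errA := strconv.Atoi(strings.TrimSpace(parts[0]))
	b, errB := strconv.Atoi(strings.TrimSpace(parts[1]))
	if errA != nil || errB != nil {
		return "error: operands must be integers"
	}
	return strconv.Itoa(a + b)
}

func main() {
	fmt.Println(calcFn("2 + 3")) // "5"
}
```

Returning errors as text rather than panicking lets the model see what went wrong and retry with corrected arguments.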
| Challenge | Raw SDK | Go-Calque |
|---|---|---|
| Provider switching | Rewrite API calls | Change one line: `ollama.New()` → `openai.New()` |
| Conversation memory | Manual state management | `convMem.Input()` / `convMem.Output()` |
| Tool calling | Parse, match, handle errors | `ai.WithTools(...)` - automatic |
| Structured output | Hope AI follows instructions | `ai.WithSchema()` - guaranteed types |
| Retries & fallbacks | Custom logic | `ctrl.Retry()`, `ctrl.Fallback()` |
- AI Agents - OpenAI, Gemini, Ollama with unified interface
- Tool Calling - Auto-discovery and execution of Go functions
- Memory - Conversation history with configurable limits
- RAG & Retrieval - Vector search, context building, semantic filtering
- Converters - JSON, YAML, Protobuf, JSONSchema, SSE
- Flow Control - Retry, timeout, fallback, parallel, chain
- Observability - Metrics, tracing, health checks, structured logging
- MCP Support - Model Context Protocol client
- Multi-Agent - Agent routing and load balancing
📖 See Full Middleware Reference →
Go-Calque is built for production AI workloads where LLM latency dominates.
| Metric | Value |
|---|---|
| Framework Overhead | <0.02% at 100ms AI latency |
| Streaming | 3x faster than buffered |
| Text Processing | Up to 86% faster than hand-coded |
| Memory | 87% less allocation with streaming |
📖 See Benchmark Analysis →
| Guide | Description |
|---|---|
| Getting Started | Installation, quickstart, core concepts |
| Middleware Reference | All middleware packages and usage |
| Architecture | Streaming pipeline deep dive |
| Advanced Topics | Custom middleware, concurrency, error handling |
| Recipes & Examples | HTTP integration, testing, real-world examples |
| Performance | Benchmark analysis and optimization |
| Examples | Runnable code examples |
| API Reference | pkg.go.dev documentation |
| Example | Description |
|---|---|
| basics | Core flow concepts |
| ai-clients | OpenAI, Ollama, Gemini |
| streaming-chat | SSE streaming with memory |
| tool-calling | Function calling with AI |
| memory | Conversation memory |
| retrieval | RAG/vector search |
| mcp | Model Context Protocol |
- ✅ Tool Calling - Function execution for AI agents
- ✅ Information Retrieval - Vector search, context building, semantic filtering
- ✅ Multi-Agent - Agent routing, load balancing, conditional routing
- ✅ HTTP/API Integration - Streaming responses
- ✅ Model Context Protocol - MCP client, natural language tools
- ✅ Observability - Metrics, tracing, health checks, structured logging
- 🔲 Guardrails & Safety - Input filtering, output validation
- 🔲 Vector-based semantic memory
- 🔲 Planning & reflection capabilities
- 🔲 Anthropic/Claude support
- Fork the repository
- Create a feature branch
- Add tests for new middleware
- Submit a pull request
See AGENTS.md for development setup.
Mozilla Public License 2.0 - see LICENSE file for details.
