
Go-Calque

A composable AI agent framework for Go that makes it easy to build production-ready AI applications.

Developed by Calque AI

The Problem

Building AI apps in Go means wrestling with:

  • Provider lock-in - Switching between OpenAI, Gemini, or local models requires rewriting code
  • Conversation state - Managing chat history and context windows across requests
  • Tool calling - Connecting AI to your Go functions with proper error handling
  • Structured outputs - Getting reliable JSON responses that match your types
  • RAG pipelines - Coordinating document retrieval, embedding, and generation

Go-Calque solves these with a simple, composable middleware pattern that feels native to Go.

Installation

go get github.com/calque-ai/go-calque

Quickstart

package main

import (
    "context"
    "fmt"
    "log"

    "github.com/calque-ai/go-calque/pkg/calque"
    "github.com/calque-ai/go-calque/pkg/middleware/ai"
    "github.com/calque-ai/go-calque/pkg/middleware/ai/ollama"
)

func main() {
    client, err := ollama.New("llama3.2:3b")
    if err != nil {
        log.Fatal(err)
    }

    flow := calque.NewFlow().Use(ai.Agent(client))

    var result string
    err = flow.Run(context.Background(), "What's the capital of France?", &result)
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println(result)
}

Three lines to set up, one line to run.

What You Can Build

Chatbot with Memory

convMem := memory.NewConversation()

flow := calque.NewFlow().
    Use(convMem.Input(userID)).
    Use(ai.Agent(client)).
    Use(convMem.Output(userID))

AI with Tool Calling

calculator := tools.Simple("calc", "Math", calcFn)
weather := tools.Simple("weather", "Weather", weatherFn)

flow := calque.NewFlow().
    Use(ai.Agent(client, ai.WithTools(calculator, weather)))

Structured Output

flow := calque.NewFlow().
    Use(ai.Agent(client, ai.WithSchema(&MyType{})))

var result MyType
flow.Run(ctx, "Analyze this", convert.FromJSONSchema(&result))

RAG Pipeline

flow := calque.NewFlow().
    Use(retrieval.VectorSearch(store, opts)).
    Use(prompt.Template(ragTemplate)).
    Use(ai.Agent(client))

📖 See Getting Started Guide →

Why Go-Calque?

| Challenge | Raw SDK | Go-Calque |
| --- | --- | --- |
| Provider switching | Rewrite API calls | Change one line: ollama.New() → openai.New() |
| Conversation memory | Manual state management | convMem.Input() / convMem.Output() |
| Tool calling | Parse, match, handle errors | ai.WithTools(...) - automatic |
| Structured output | Hope AI follows instructions | ai.WithSchema() - guaranteed types |
| Retries & fallbacks | Custom logic | ctrl.Retry(), ctrl.Fallback() |

Features

Core

  • AI Agents - OpenAI, Gemini, Ollama with unified interface
  • Tool Calling - Auto-discovery and execution of Go functions
  • Memory - Conversation history with configurable limits

Data Processing

Production

📖 See Full Middleware Reference →

Performance

Go-Calque is built for production AI workloads where LLM latency dominates.

| Metric | Value |
| --- | --- |
| Framework Overhead | <0.02% at 100ms AI latency |
| Streaming | 3x faster than buffered |
| Text Processing | Up to 86% faster than hand-coded |
| Memory | 87% less allocation with streaming |

📊 See Benchmark Analysis →

Documentation

| Guide | Description |
| --- | --- |
| Getting Started | Installation, quickstart, core concepts |
| Middleware Reference | All middleware packages and usage |
| Architecture | Streaming pipeline deep dive |
| Advanced Topics | Custom middleware, concurrency, error handling |
| Recipes & Examples | HTTP integration, testing, real-world examples |
| Performance | Benchmark analysis and optimization |
| Examples | Runnable code examples |
| API Reference | pkg.go.dev documentation |

Examples

| Example | Description |
| --- | --- |
| basics | Core flow concepts |
| ai-clients | OpenAI, Ollama, Gemini |
| streaming-chat | SSE streaming with memory |
| tool-calling | Function calling with AI |
| memory | Conversation memory |
| retrieval | RAG/vector search |
| mcp | Model Context Protocol |

Roadmap

Middleware

  • ✅ Tool Calling - Function execution for AI agents
  • ✅ Information Retrieval - Vector search, context building, semantic filtering
  • ✅ Multi-Agent - Agent routing, load balancing, conditional routing
  • ✅ HTTP/API Integration - Streaming responses
  • ✅ Model Context Protocol - MCP client, natural language tools
  • ✅ Observability - Metrics, tracing, health checks, structured logging
  • 🔲 Guardrails & Safety - Input filtering, output validation

Framework

  • 🔲 Vector-based semantic memory
  • 🔲 Planning & reflection capabilities
  • 🔲 Anthropic/Claude support

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Add tests for new middleware
  4. Submit a pull request

See AGENTS.md for development setup.

License

Mozilla Public License 2.0 - see LICENSE file for details.
