4 changes: 2 additions & 2 deletions README.md
@@ -39,8 +39,8 @@ cargo install moltis --git https://github.com/moltis-org/moltis

## Features

- **Multi-provider LLM support** — OpenAI Codex, GitHub Copilot, and Local
LLM through a trait-based provider architecture
- **Multi-provider LLM support** — Zaps.ai (Privacy Gateway), OpenAI Codex,
GitHub Copilot, and Local LLM through a trait-based provider architecture
- **Streaming responses** — real-time token streaming for a responsive user
experience, including when tools are enabled (tool calls stream argument
deltas as they arrive)
7 changes: 7 additions & 0 deletions crates/agents/src/providers/mod.rs
@@ -539,6 +539,13 @@ const OPENAI_COMPAT_PROVIDERS: &[OpenAiCompatDef] = &[
default_base_url: "https://api.venice.ai/api/v1",
models: &[],
},
OpenAiCompatDef {
config_name: "zaps",
env_key: "ZAPS_API_KEY",
env_base_url_key: "ZAPS_BASE_URL",
default_base_url: "https://zaps.ai/v1",
models: &[],
},
OpenAiCompatDef {
config_name: "ollama",
env_key: "OLLAMA_API_KEY",
1 change: 1 addition & 0 deletions crates/cli/src/doctor_commands.rs
@@ -123,6 +123,7 @@ const PROVIDER_ENV_MAP: &[(&str, &str, bool)] = &[
("minimax", "MINIMAX_API_KEY", false),
("moonshot", "MOONSHOT_API_KEY", false),
("venice", "VENICE_API_KEY", false),
("zaps", "ZAPS_API_KEY", false),
("ollama", "OLLAMA_API_KEY", true),
("kimi-code", "KIMI_API_KEY", false),
];
13 changes: 11 additions & 2 deletions crates/config/src/template.rs
@@ -88,8 +88,8 @@ offered = ["local-llm", "github-copilot", "openai", "anthropic", "ollama", "moon
# All available providers:
# "anthropic", "openai", "gemini", "groq", "xai", "deepseek",
# "mistral", "openrouter", "cerebras", "minimax", "moonshot",
# "venice", "ollama", "local-llm", "openai-codex", "github-copilot",
# "kimi-code"
# "venice", "zaps", "ollama", "local-llm", "openai-codex",
# "github-copilot", "kimi-code"

# ── Anthropic (Claude) ────────────────────────────────────────
# [providers.anthropic]
@@ -145,6 +145,15 @@ models = ["gpt-5.3", "gpt-5.2"] # Preferred models shown first
# models = ["anthropic/claude-3.5-sonnet"] # Any model IDs on OpenRouter
# base_url = "https://openrouter.ai/api/v1"

# ── Zaps.ai (PII redaction gateway) ────────────────────────────
# Routes requests through Zaps.ai's privacy gateway, which
# automatically redacts PII before forwarding to upstream LLMs.
# [providers.zaps]
# enabled = true
# api_key = "gk_..." # Or set ZAPS_API_KEY env var
# models = ["gpt-4o", "deepseek-chat"] # Any model supported by Zaps gateway
# base_url = "https://zaps.ai/v1"

# ── Moonshot (Kimi) ─────────────────────────────────────────
[providers.moonshot]
# enabled = true
1 change: 1 addition & 0 deletions crates/config/src/validate.rs
@@ -97,6 +97,7 @@ const KNOWN_PROVIDER_NAMES: &[&str] = &[
"minimax",
"moonshot",
"venice",
"zaps",
"ollama",
];

9 changes: 9 additions & 0 deletions crates/gateway/src/provider_setup.rs
@@ -663,6 +663,15 @@ fn known_providers() -> Vec<KnownProvider> {
requires_model: true,
key_optional: false,
},
KnownProvider {
name: "zaps",
display_name: "Zaps.ai",
auth_type: "api-key",
env_key: Some("ZAPS_API_KEY"),
default_base_url: Some("https://zaps.ai/v1"),
requires_model: true,
key_optional: false,
},
KnownProvider {
name: "ollama",
display_name: "Ollama",
1 change: 1 addition & 0 deletions crates/gateway/src/server.rs
@@ -2150,6 +2150,7 @@ pub async fn start_gateway(
("minimax", "MINIMAX_API_KEY", "https://api.minimax.chat/v1"),
("moonshot", "MOONSHOT_API_KEY", "https://api.moonshot.ai/v1"),
("venice", "VENICE_API_KEY", "https://api.venice.ai/api/v1"),
("zaps", "ZAPS_API_KEY", "https://zaps.ai/v1"),
];

for (config_name, env_key, default_base) in EMBEDDING_CANDIDATES {
10 changes: 10 additions & 0 deletions docs/src/providers.md
@@ -7,6 +7,7 @@ Configure providers through the web UI or directly in configuration files.

| Provider | Auth | Notes |
|----------|------|-------|
| **Zaps.ai** | API Key | Privacy-first gateway (redacts PII) |
| **OpenAI Codex** | OAuth | Codex-focused cloud models |
| **GitHub Copilot** | OAuth | Requires active Copilot subscription |
| **Local LLM** | Local runtime | Runs models on your machine |
@@ -54,6 +55,15 @@ model = "qwen2.5-coder-7b-q4_k_m"

## Provider Setup

### Zaps.ai

Zaps.ai acts as a privacy shield between you and upstream LLM providers, automatically redacting personally identifiable information (PII) from requests before they reach the upstream model.

1. Go to **Settings** -> **Providers** -> **Zaps.ai**.
2. Enter your API Key (starts with `gk_`).
3. (Optional) Set the Base URL if using a custom endpoint (default: `https://zaps.ai/v1`).
4. Enter the model ID you wish to use (e.g., `deepseek-chat`).
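
Since the provider is registered as OpenAI-compatible (see `OpenAiCompatDef` in this PR), a standard chat-completions request against the gateway's base URL should work. The sketch below only builds such a request without sending it; the `/chat/completions` path, payload shape, and `build_chat_request` helper are assumptions based on the OpenAI API convention, not anything Zaps-specific confirmed here.

```python
import json
import os

# Defaults taken from this PR's config template; override via env vars.
ZAPS_BASE_URL = os.environ.get("ZAPS_BASE_URL", "https://zaps.ai/v1")
ZAPS_API_KEY = os.environ.get("ZAPS_API_KEY", "gk_example")


def build_chat_request(model: str, prompt: str) -> tuple[str, dict, bytes]:
    """Build an OpenAI-style chat-completions request for the Zaps gateway."""
    url = f"{ZAPS_BASE_URL.rstrip('/')}/chat/completions"
    headers = {
        "Authorization": f"Bearer {ZAPS_API_KEY}",
        "Content-Type": "application/json",
    }
    body = json.dumps(
        {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }
    ).encode()
    return url, headers, body


url, headers, body = build_chat_request("deepseek-chat", "Hello")
```

Any HTTP client can then POST `body` to `url` with `headers`; because the gateway speaks the OpenAI wire format, existing OpenAI SDKs pointed at the Zaps base URL should also work unchanged.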

### OpenAI Codex

OpenAI Codex uses OAuth-based access, with token import for authentication.