Add LiteLLM provider support #507
Add native support for LiteLLM Proxy as a model provider. LiteLLM provides a unified OpenAI-compatible API gateway for 100+ LLM providers (OpenAI, Anthropic, Gemini, etc.) with features like load balancing, fallbacks, and spend tracking.
Current Behavior
When configuring LiteLLM as a custom provider, Spacebot sends model names with the provider prefix (e.g., `litellm/kimi-k2p5`), but LiteLLM expects plain model IDs (`kimi-k2p5`). This results in 400 errors:
```
400: {'error': '/chat/completions: Invalid model name passed in model=litellm/kimi-k2p5'}
```
Proposed Solution
Add LiteLLM to the `remap_model_name_for_api()` function in `src/llm/model.rs` to strip the provider prefix:
```rust
fn remap_model_name_for_api(provider: &str, model_name: &str) -> String {
    if provider == "zai-coding-plan" {
        // Coding Plan endpoint expects plain model ids (e.g. "glm-5").
        model_name
            .strip_prefix("zai/")
            .unwrap_or(model_name)
            .to_string()
    } else if provider == "litellm" {
        // LiteLLM expects plain model ids (e.g. "kimi-k2p5").
        model_name
            .strip_prefix("litellm/")
            .unwrap_or(model_name)
            .to_string()
    } else {
        model_name.to_string()
    }
}
```
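As a sanity check, the intended behavior could be pinned down with a small test. This is a sketch only: the logic is condensed into a `match`, and the model ids used in the assertions are illustrative, not an exhaustive list.

```rust
// Condensed form of the proposed remapping, for testing in isolation.
fn remap_model_name_for_api(provider: &str, model_name: &str) -> String {
    match provider {
        // Coding Plan endpoint expects plain model ids (e.g. "glm-5").
        "zai-coding-plan" => model_name.strip_prefix("zai/").unwrap_or(model_name).to_string(),
        // LiteLLM expects plain model ids (e.g. "kimi-k2p5").
        "litellm" => model_name.strip_prefix("litellm/").unwrap_or(model_name).to_string(),
        // All other providers keep the name as-is.
        _ => model_name.to_string(),
    }
}

fn main() {
    // Prefix is stripped for the litellm provider...
    assert_eq!(remap_model_name_for_api("litellm", "litellm/kimi-k2p5"), "kimi-k2p5");
    // ...and an already-plain id passes through unchanged.
    assert_eq!(remap_model_name_for_api("litellm", "kimi-k2p5"), "kimi-k2p5");
    // Existing zai-coding-plan behavior is preserved.
    assert_eq!(remap_model_name_for_api("zai-coding-plan", "zai/glm-5"), "glm-5");
    // Other providers are untouched, even if the name contains a prefix.
    assert_eq!(remap_model_name_for_api("openai", "openai/gpt-4o"), "openai/gpt-4o");
    println!("all remapping checks passed");
}
```

Note the `match` form behaves identically to the `if`/`else if` chain in the patch above; either shape works, and the chain may be preferable if it matches the surrounding style in `src/llm/model.rs`.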