Feature hasn't been suggested before.
- I have verified this feature I'm about to request hasn't been suggested before.
Describe the enhancement you want to request
Currently, Kilo Code CLI appears to be limited to a fixed set of hardcoded providers. This restricts users from leveraging:
- Local Models: Running local LLMs via Ollama, LocalAI, or LM Studio for privacy and cost reasons.
- Alternative Providers: Using high-speed or specialized providers that expose an OpenAI-compatible API (like Groq, Together AI, DeepSeek, or OpenRouter).
- Enterprise Gateways: Corporate proxy endpoints that mimic the OpenAI schema.
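All three cases above expose the same OpenAI-style request shape, which is why a single configurable provider could serve them all. As a minimal sketch (the endpoint path and field names follow the common OpenAI chat-completions convention; the helper name is illustrative, not part of Kilo Code):

```python
import json

def build_chat_request(base_url, api_key, model, messages):
    """Build an OpenAI-style chat completion request: (url, headers, body).

    The same shape works whether base_url points at OpenAI, Ollama,
    Groq, or a corporate gateway -- only the URL and key change.
    """
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",  # local providers accept a dummy key
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body

# Pointing the identical request at a local Ollama endpoint:
url, headers, body = build_chat_request(
    "http://localhost:11434/v1", "ollama", "llama3",
    [{"role": "user", "content": "hi"}],
)
```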
I would like to see a new provider type in the configuration (e.g., openai-compatible or custom). This provider should allow users to specify:
- base_url: The custom endpoint (e.g., http://localhost:11434/v1).
- api_key: A field for the key (or a dummy string for local providers).
- model: The specific model string to be passed to the API.
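To make this concrete, a hypothetical configuration entry might look like the following (the key names are illustrative, not an existing Kilo Code schema):

```json
{
  "provider": "openai-compatible",
  "base_url": "http://localhost:11434/v1",
  "api_key": "ollama",
  "model": "llama3"
}
```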
As it stands, users are forced to supply official OpenAI/Anthropic keys, which may not be viable for developers working on sensitive codebases who prefer local inference.
If I have missed an existing setting for this, please let me know!
Thank you for your help!