
[FEATURE]: OpenAI Compatible Providers #318

@smaEti

Description

Feature hasn't been suggested before.

  • I have verified this feature I'm about to request hasn't been suggested before.

Describe the enhancement you want to request

Currently, Kilo Code CLI appears to be limited to a fixed set of hardcoded providers. This restricts users from leveraging:

  1. Local Models: Running local LLMs via Ollama, LocalAI, or LM Studio for privacy and cost reasons.
  2. Alternative Providers: Using high-speed or specialized providers that expose an OpenAI-compatible API (such as Groq, Together AI, DeepSeek, or OpenRouter).
  3. Enterprise Gateways: Corporate proxy endpoints that mimic the OpenAI schema.

I would like to see a new provider type in the configuration (e.g., openai-compatible or custom). This provider should allow users to specify the following (a usage sketch follows the list):

  • base_url: The custom endpoint (e.g., http://localhost:11434/v1)
  • api_key: A field for the key (or a dummy string for local providers).
  • model: The specific model string to be passed to the API.
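
To make the mapping concrete, here is a minimal sketch of how those three fields translate into a request against an OpenAI-compatible endpoint, using the official openai Python SDK pointed at a local Ollama server. The endpoint URL, dummy key, and model name are illustrative values only, not anything Kilo Code currently supports:

```python
from openai import OpenAI

# The three proposed config fields, shown with example values:
client = OpenAI(
    base_url="http://localhost:11434/v1",  # proposed base_url (local Ollama example)
    api_key="ollama",                      # proposed api_key (dummy string; local servers ignore it)
)

response = client.chat.completions.create(
    model="llama3.1",  # proposed model: whatever identifier the backend expects
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)
```

Since every provider listed above speaks this same schema, swapping just these three values would cover Ollama, Groq, Together AI, DeepSeek, OpenRouter, and enterprise gateways with a single openai-compatible provider type.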

As it stands, users are forced to use official OpenAI/Anthropic keys, which may not be viable for developers working on sensitive codebases who prefer local inference.

If I have missed an existing setting for this, please let me know!

Thank you for your help!
