[Model]: Copilot enterprise tokens return 421 unless using proxy endpoint + streaming #352

@lijunle

Description

Preflight Checklist

  • I have searched existing issues for similar reports
  • This report does NOT contain sensitive information (API keys, passwords, personal data)
  • If this happened during a chat session, I included as much of the full session context as possible

Type of behavior issue

Unexpected behavior with a specific LLM provider

What you asked the agent to do

Summary
Using Moltis with GitHub Copilot Enterprise tokens fails with HTTP 421 Misdirected Request because requests are sent to api.individual.githubcopilot.com. Enterprise tokens must instead use the enterprise proxy endpoint (proxy.enterprise.githubcopilot.com). In addition, the proxy endpoint requires streaming for chat completions (stream: true); non‑streaming requests fail with 400: "stream": false is not supported.

Steps to Reproduce

  1. Authenticate Moltis with a Copilot Enterprise account.
  2. Try any Copilot chat request.
  3. Observe failure.

Observed Behavior

  • GET https://api.individual.githubcopilot.com/models → HTTP 421 Misdirected Request
  • POST https://api.individual.githubcopilot.com/chat/completions → HTTP 421 Misdirected Request

Expected Behavior

  • Enterprise tokens should be routed via the proxy endpoint returned in the token response (proxy-ep / enterprise proxy).
  • Chat completions should use streaming on the proxy endpoint.

Verified Workaround

  • Use the enterprise proxy:
    • GET https://proxy.enterprise.githubcopilot.com/models → 200 OK
    • POST https://proxy.enterprise.githubcopilot.com/chat/completions with stream: true → 200 OK
  • Non‑streaming on the proxy returns:
    • 400: "stream": false is not supported
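Since the proxy rejects non‑streaming requests, the request body should always force stream: true. A minimal sketch of how the body could be built (the function name and the OpenAI‑style field layout are assumptions, not Moltis code):

```rust
/// Hypothetical helper: builds a chat/completions body with `stream`
/// unconditionally set to true, since the enterprise proxy returns
/// 400 for `"stream": false`.
fn build_chat_body(model: &str, prompt: &str) -> String {
    format!(
        r#"{{"model":"{}","messages":[{{"role":"user","content":"{}"}}],"stream":true}}"#,
        model, prompt
    )
}

fn main() {
    let body = build_chat_body("gpt-4o", "hello");
    // The body always carries the streaming flag.
    assert!(body.contains(r#""stream":true"#));
    println!("{body}");
}
```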

Relevant Code

const GITHUB_DEVICE_CODE_URL: &str = "https://github.com/login/device/code";
const GITHUB_TOKEN_URL: &str = "https://github.com/login/oauth/access_token";
const COPILOT_TOKEN_URL: &str = "https://api.github.com/copilot_internal/v2/token";
const COPILOT_API_BASE: &str = "https://api.individual.githubcopilot.com";
const COPILOT_MODELS_ENDPOINT: &str = "https://api.individual.githubcopilot.com/models";

The provider currently hardcodes:

  • https://api.individual.githubcopilot.com/...
  • Headers:
    • Editor-Version: vscode/1.96.2
    • User-Agent: GitHubCopilotChat/0.26.7

Suggested Fix

  • Use proxy-ep from the token response when present (enterprise tokens).
  • Route requests to enterprise proxy endpoint.
  • Ensure streaming for chat/completions on enterprise proxy (non‑streaming returns 400).
  • Keep api.individual.githubcopilot.com for non‑enterprise tokens.
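The routing change above could look roughly like the following sketch, assuming the token response exposes an optional proxy-ep host for enterprise tokens (the function name and the exact shape of proxy-ep are assumptions):

```rust
// Existing hardcoded base for individual tokens.
const COPILOT_API_BASE: &str = "https://api.individual.githubcopilot.com";

/// Hypothetical helper: picks the API base from an optional `proxy-ep`
/// value in the Copilot token response. Enterprise tokens carry a proxy
/// host and must be routed through it; individual tokens keep the
/// existing hardcoded base.
fn api_base(proxy_ep: Option<&str>) -> String {
    match proxy_ep {
        Some(ep) => format!("https://{}", ep.trim_end_matches('/')),
        None => COPILOT_API_BASE.to_string(),
    }
}

fn main() {
    assert_eq!(
        api_base(Some("proxy.enterprise.githubcopilot.com")),
        "https://proxy.enterprise.githubcopilot.com"
    );
    assert_eq!(api_base(None), "https://api.individual.githubcopilot.com");
    println!("ok");
}
```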

Additional Notes

  • Updating the headers to newer versions did not fix the 421:
    • Editor-Version: vscode/1.110.1
    • User-Agent: GitHubCopilotChat/0.38.2

Can you reproduce this?

Yes, every time with the same prompt

LLM provider and model

Copilot

Moltis version

0.10.17
