
fix: correct Copilot GPT-5.2 token limits#12273

Open
dlukt wants to merge 4 commits into anomalyco:dev from dlukt:fix/copilot-gpt-5-2-context

Conversation


@dlukt dlukt commented Feb 5, 2026

What does this PR do?

Fixes #11086

GitHub Copilot’s gpt-5.2 / gpt-5.2-codex models are currently ingested from models.dev with incorrect limit values, which makes OpenCode think the context window is much smaller than it really is. Since OpenCode uses model.limit.context / model.limit.output for both the context meter and auto-compaction overflow checks, this shows an inflated % used and triggers compaction far too early (e.g. ~90–100k tokens). (Refs #11086, duplicate #12247.)

This PR fixes that by overriding the token limits during models.dev -> Provider.Model conversion, but only for providerID starting with github-copilot and only for model IDs gpt-5.2 and gpt-5.2-codex. The override sets:

  • context: 400000
  • input: 272000
  • output: 128000

This works because every downstream consumer (UI usage, overflow/compaction thresholds, max output calculations) reads from Provider.Model.limit. Correcting the limits at ingestion time fixes behavior without changing session logic, and the scope is intentionally narrow to avoid impacting other Copilot models.
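A minimal sketch of the override described above, assuming illustrative type and function names (`ModelLimit`, `ProviderModel`, `applyCopilotOverride` are hypothetical, not OpenCode's actual API):

```typescript
// Hypothetical shapes; OpenCode's real Provider.Model type may differ.
interface ModelLimit {
  context: number
  input?: number
  output: number
}

interface ProviderModel {
  id: string
  limit: ModelLimit
}

// Corrected limits for Copilot's gpt-5.2 variants, per this PR.
const COPILOT_GPT52_LIMITS: ModelLimit = {
  context: 400_000,
  input: 272_000,
  output: 128_000,
}

// Applied during models.dev -> Provider.Model conversion: narrow the
// override to github-copilot providers and the two affected model IDs.
function applyCopilotOverride(providerID: string, model: ProviderModel): ProviderModel {
  const isCopilot = providerID.startsWith("github-copilot")
  const isGpt52 = model.id === "gpt-5.2" || model.id === "gpt-5.2-codex"
  if (isCopilot && isGpt52) {
    return { ...model, limit: { ...COPILOT_GPT52_LIMITS } }
  }
  return model
}

// Example: stale models.dev limits get replaced at ingestion time.
const fixed = applyCopilotOverride("github-copilot", {
  id: "gpt-5.2-codex",
  limit: { context: 128_000, output: 16_000 },
})
console.log(fixed.limit.context) // 400000
```

Because the correction happens once at ingestion, every downstream reader of `Provider.Model.limit` sees the fixed values with no session-logic changes.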

How did you verify your code works?

  • Added a regression test that asserts the override is applied for both github-copilot/gpt-5.2 and github-copilot/gpt-5.2-codex.
  • Ran cd packages/opencode && bun test test/provider/provider.test.ts.


github-actions bot commented Feb 5, 2026

Thanks for your contribution!

This PR doesn't have a linked issue. All PRs must reference an existing issue.

Please:

  1. Open an issue describing the bug/feature (if one doesn't exist)
  2. Add Fixes #<number> or Closes #<number> to this PR description

See CONTRIBUTING.md for details.


github-actions bot commented Feb 5, 2026

The following comment was generated by an LLM and may be inaccurate:

No duplicate PRs found



Development

Successfully merging this pull request may close these issues.

GPT 5.2 Codex - Compaction happens way too early and context limit is wrong
