fix: treat zero limit.output as unknown to enable fallback to bundled… #3316

Open
ahuangsnail wants to merge 1 commit into code-yeongyu:dev from ahuangsnail:fix/max-output-tokens-zero-fallback

Conversation

@ahuangsnail

@ahuangsnail ahuangsnail commented Apr 10, 2026

Summary

  • Fix the `maxOutputTokens must be >= 1` error when using custom OpenAI-compatible providers (e.g., infini-ai) that don't return model limits
  • Treat zero or negative `limit.output` values as unknown so the lookup falls back to the bundled snapshot, which has the correct limits
  • Aligns `readRuntimeModelLimitOutput` behavior with the nullish-coalescing fallback chain in `get-model-capabilities.ts`

Changes

  • Modified `readRuntimeModelLimitOutput()` in `src/shared/model-capabilities/runtime-model-readers.ts`:
    • Changed the return from `readNumber(limit.output)` to a check that accepts positive values only
    • Returns `undefined` for 0 or negative values instead of passing them through
    • This allows the `??` operator in `getModelCapabilities()` to fall through to bundled snapshot entries
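The change described above can be sketched roughly as follows. This is an illustrative reconstruction from the PR description, not the actual repo source; the `RuntimeModelLimit` type and `readNumber` helper shown here are assumptions:

```typescript
// Hypothetical shape of a cached provider-models.json limit entry.
type RuntimeModelLimit = { output?: unknown };

// Assumed helper: coerce an unknown value to a finite number, or undefined.
function readNumber(value: unknown): number | undefined {
  return typeof value === "number" && Number.isFinite(value) ? value : undefined;
}

// Before the fix this returned readNumber(limit.output) directly, so a
// cached 0 passed straight through. After the fix, non-positive values are
// treated as unknown so callers can fall back to the bundled snapshot.
function readRuntimeModelLimitOutput(limit: RuntimeModelLimit): number | undefined {
  const output = readNumber(limit.output);
  return output !== undefined && output > 0 ? output : undefined;
}
```

With this shape, `readRuntimeModelLimitOutput({ output: 0 })` yields `undefined`, while a real limit such as `32768` passes through unchanged.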

Testing

```
bun run typecheck
bun test
```

To verify the fix manually:
1. Configure a custom OpenAI-compatible provider that doesn't return model limits
2. Use any agent (e.g., sisyphus) with a model from that provider
3. Confirm no `maxOutputTokens must be >= 1` error occurs
4. Verify `maxOutputTokens` falls back to bundled snapshot values (e.g., 32768 for kimi-k2.5)
Related Issues

---
## Summary by cubic
Fixes `maxOutputTokens` errors with some OpenAI-compatible providers by treating zero or negative `limit.output` as unknown. `readRuntimeModelLimitOutput()` now returns `undefined` for non-positive values so `getModelCapabilities()` falls back to bundled snapshot limits.

<sup>Written for commit 6c3308aab3904b1e91ea29fba3487e77865c689b. Summary will update on new commits.</sup>


fix: treat zero limit.output as unknown to enable fallback to bundled snapshot

When a custom OpenAI-compatible provider does not return model limits,
its provider-models.json cache entries have limit.output = 0. The nullish
coalescing operator (??) does not fall back on 0, since 0 is not undefined,
so maxOutputTokens stays at 0 and triggers the AI SDK error:
"maxOutputTokens must be >= 1".

By returning undefined for 0 or negative values, the ?? in
get-model-capabilities.ts correctly falls through to the bundled snapshot
which contains the correct output limits.
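The failure mode described in the commit message can be illustrated with a minimal TypeScript sketch. The variable names here are illustrative, not taken from the repo:

```typescript
// ?? only falls back on null or undefined, so a cached limit.output of 0
// survives the fallback chain untouched.
const cachedOutputLimit: number | undefined = 0; // cache entry from a provider with no limits
const bundledOutputLimit = 32768;                // value from the bundled snapshot

// Before the fix: 0 is neither null nor undefined, so ?? keeps it.
const before = cachedOutputLimit ?? bundledOutputLimit; // 0 — triggers the AI SDK error

// After the fix: non-positive values are normalized to undefined first,
// so ?? falls through to the bundled snapshot value.
const normalized =
  cachedOutputLimit !== undefined && cachedOutputLimit > 0 ? cachedOutputLimit : undefined;
const after = normalized ?? bundledOutputLimit; // 32768
```

This is why the fix normalizes the value before the `??` chain rather than changing the chain itself: `??` is working as specified, and the bug is that 0 is a present-but-invalid value.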
@github-actions
Contributor

github-actions bot commented Apr 10, 2026

All contributors have signed the CLA. Thank you! ✅
Posted by the CLA Assistant Lite bot.


@cubic-dev-ai cubic-dev-ai bot left a comment

No issues found across 1 file

Confidence score: 5/5

  • Automated review surfaced no issues in the provided summaries.
  • No files require special attention.

Auto-approved: Fixes maxOutputTokens errors by treating non-positive limits as unknown, allowing fallback to bundled snapshots. Logic is sound and carries no regression risk.

@codgician

codgician commented Apr 10, 2026

Tested locally and fixes #3247

@ahuangsnail
Author

I have read the CLA Document and I hereby sign the CLA

github-actions bot added a commit that referenced this pull request Apr 11, 2026
@ahuangsnail
Author

recheck
