feat(provider): add Vercel AI Gateway support #3376
code-yeongyu merged 8 commits into code-yeongyu:dev
Conversation
Adds 'vercel' as a recognized provider throughout the model resolution system, enabling users with Vercel AI Gateway to use it as a universal fallback for OpenAI, Anthropic, and Google models. The gateway transform infers the sub-provider from canonical model names (claude->anthropic, gpt->openai, gemini->google) and delegates to the appropriate provider-specific transform, producing model strings like vercel/anthropic/claude-opus-4.6.
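The inference-and-delegation step described above can be sketched as follows. Helper and map names here are hypothetical, not the PR's actual identifiers:

```typescript
// Hypothetical sketch: infer the gateway sub-provider from the
// canonical model name prefix, then build the gateway model string.
const SUB_PROVIDER_BY_PREFIX: Record<string, string> = {
  claude: "anthropic",
  gpt: "openai",
  gemini: "google",
};

function inferSubProvider(model: string): string | undefined {
  const prefix = Object.keys(SUB_PROVIDER_BY_PREFIX).find((p) =>
    model.startsWith(p),
  );
  return prefix ? SUB_PROVIDER_BY_PREFIX[prefix] : undefined;
}

function toGatewayModel(model: string): string {
  const sub = inferSubProvider(model);
  if (!sub) throw new Error(`no gateway sub-provider for ${model}`);
  return `vercel/${sub}/${model}`;
}

// toGatewayModel("claude-opus-4.6") → "vercel/anthropic/claude-opus-4.6"
```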
All contributors have signed the CLA. Thank you! ✅
I have read the CLA Document and I hereby sign the CLA
recheck
The gateway uses google/gemini-3-flash (no -preview suffix) unlike the direct Google API. Give the vercel provider its own transform instead of delegating to sub-provider transforms blindly.
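A dedicated gateway transform along the lines requested could look like this minimal sketch (function name and case handling are assumptions, not the PR's code):

```typescript
// Hypothetical sketch: the vercel provider applies its own gemini
// normalization rather than delegating to the Google transform.
// The gateway catalog lists gemini-3-flash WITHOUT the -preview
// suffix that the direct Google API uses.
function normalizeGeminiForGateway(model: string): string {
  if (model === "gemini-3-flash-preview") return "gemini-3-flash";
  // Other gemini IDs (e.g. gemini-3.1-pro-preview) pass through as-is.
  return model;
}
```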
Adds vercel to every fallback entry where the model exists on the gateway (kimi-k2.5, glm-5, glm-4.6v, minimax-m2.7, grok-code-fast-1, gpt-5-nano). Adds inferSubProvider mappings for minimax, moonshotai, and zai. Updates librarian/explore special cases to prefer minimax via gateway over claude-haiku.
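Appending vercel last in each fallback entry keeps native providers preferred; a sketch of that ordering (the entry shape and resolver are hypothetical, not the PR's actual types):

```typescript
// Hypothetical fallback-entry shape: native providers first, the
// vercel gateway appended last so it only serves as a fallback.
type FallbackEntry = { providers: string[] };

const fallbacks: Record<string, FallbackEntry> = {
  "glm-5": { providers: ["zai", "vercel"] },
  "grok-code-fast-1": { providers: ["xai", "vercel"] },
};

// Resolution walks the list in order and stops at the first
// provider the user has configured.
function resolve(model: string, configured: Set<string>): string | undefined {
  return fallbacks[model]?.providers.find((p) => configured.has(p));
}
```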
Claude version transforms now use a single regex `claude-(\w+)-(\d+)-(\d+)` -> `claude-$1-$2.$3` instead of one `.replace()` per model. New claude models need zero changes.
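The single-regex approach in practice (function name is illustrative):

```typescript
// One generic pattern covers every claude model's version suffix:
// the trailing -<major>-<minor> becomes -<major>.<minor>.
const CLAUDE_VERSION = /claude-(\w+)-(\d+)-(\d+)/;

function normalizeClaude(model: string): string {
  return model.replace(CLAUDE_VERSION, "claude-$1-$2.$3");
}

// normalizeClaude("claude-opus-4-6") → "claude-opus-4.6"
// Already-dotted IDs don't match the pattern and pass through unchanged.
```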
No issues found across 13 files
Confidence score: 5/5
- Automated review surfaced no issues in the provided summaries.
- No files require special attention.
Auto-approved: Comprehensive implementation of Vercel AI Gateway support with extensive unit tests, snapshot updates, and correctly handled model name transformations across providers.
0 issues found across 5 files (changes from recent commits).
Requires human review: Changes model ID formatting for existing Anthropic provider (switching from hyphens to dots), which may cause regressions in API calls if downstream clients expect the previous format.
- Add `--vercel-ai-gateway` CLI option to install command
- Fix librarian/explore priority: native providers (ZAI, Copilot) now take precedence over Vercel gateway
- Add `hasVercelAiGateway` to installer no-provider warning conditions
- Add `hasVercelAiGateway: false` to all test file InstallConfig fixtures
- Fix test expectations for model ID format (`claude-opus-4.6` vs `claude-opus-4-6`)
Force-pushed ff42fe5 to e3b5c2b
No issues found across 22 files
Confidence score: 5/5
- Automated review surfaced no issues in the provided summaries.
- No files require special attention.
Requires human review: The PR modifies existing model naming conventions for native Anthropic providers (changing 4-6 to 4.6), which could cause regressions in systems expecting the previous format.
Adds `vercel` as a recognized provider throughout the model resolution system, enabling users with a Vercel AI Gateway API key to use it as a universal fallback. The gateway proxies all supported providers (OpenAI, Anthropic, Google, xAI, MiniMax, Moonshot AI, Z.ai) with automatic multi-provider routing and failover. The provider string is `vercel` and model IDs follow the `vercel/<sub-provider>/<model>` format (e.g. `vercel/anthropic/claude-opus-4.6`, `vercel/minimax/minimax-m2.7`, `vercel/zai/glm-5`).

Model transform

The `vercel` transform infers the sub-provider from canonical model name prefixes (`claude-` → anthropic, `gpt-` → openai, `gemini-` → google, `grok-` → xai, `minimax-` → minimax, `kimi-` → moonshotai, `glm-` → zai) and applies gateway-specific name normalization. Claude version dashes become dots via a generic regex (`claude-(\w+)-(\d+)-(\d+)` → `claude-$1-$2.$3`) so new claude models need zero changes. The gateway uses `gemini-3.1-pro-preview` but `gemini-3-flash` (no `-preview`), which differs from the native Google API.

Fallback chains
`vercel` is added to every fallback entry where the model exists on the gateway catalog. Only `k2p5` (kimi-for-coding specific) and `big-pickle` (opencode specific) are excluded. The gateway is always last in the providers list, so native providers take priority.

CLI installer
- `--vercel-ai-gateway` flag for non-TUI install
- `"vercel/"` in existing omo config

Tests
- `vercel/<sub-provider>/<model>` format