First-class support for OpenAI-compatible LLM providers in TanStack AI #151
RadheyMugdal started this conversation in Ideas
Hi 👋
TanStack AI is great. One limitation I ran into: TanStack AI currently requires provider-specific adapters, which makes it hard to use OpenAI-compatible APIs such as OpenRouter, Together, Groq, Fireworks, or self-hosted gateways.
It would be great to have first-class support for OpenAI-compatible providers, for example via a generic openaiCompatible() adapter.
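To make the idea concrete, here is a minimal sketch of what such an adapter could look like. All names here (openaiCompatible, OpenAICompatibleOptions, buildChatRequest) are hypothetical illustrations, not TanStack AI's actual API; the key point is that one adapter parameterized by baseURL covers every provider that speaks the OpenAI wire format.

```typescript
// Hypothetical sketch: these names are assumptions for illustration,
// not part of TanStack AI's real API surface.
interface OpenAICompatibleOptions {
  baseURL: string; // e.g. "https://openrouter.ai/api/v1"
  apiKey: string;
  model: string;
}

interface ChatMessage {
  role: string;
  content: string;
}

interface ChatRequest {
  url: string;
  headers: Record<string, string>;
  body: { model: string; messages: ChatMessage[] };
}

function openaiCompatible(opts: OpenAICompatibleOptions) {
  return {
    // Build a request against the shared /chat/completions endpoint
    // that OpenAI-compatible providers expose.
    buildChatRequest(messages: ChatMessage[]): ChatRequest {
      return {
        url: `${opts.baseURL.replace(/\/$/, "")}/chat/completions`,
        headers: {
          "Content-Type": "application/json",
          Authorization: `Bearer ${opts.apiKey}`,
        },
        body: { model: opts.model, messages },
      };
    },
  };
}

// Usage: point the same adapter at OpenRouter (or Groq, Together, a
// self-hosted gateway, ...) just by changing baseURL and model.
const adapter = openaiCompatible({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: "sk-...",
  model: "meta-llama/llama-3.1-8b-instruct",
});
const req = adapter.buildChatRequest([{ role: "user", content: "Hello" }]);
console.log(req.url); // https://openrouter.ai/api/v1/chat/completions
```

The design choice worth noting: because these providers all accept the same request shape, the adapter only needs a base URL, an API key, and a model id, rather than any provider-specific logic.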
Why this matters: providers such as OpenRouter, Together, Groq, and Fireworks all speak the same OpenAI wire format, so a single generic adapter would unlock all of them (plus self-hosted gateways) without per-provider work.
Thanks for considering this!