Conversation
Hi @3ba2ii - thanks for the PR. How does Azure-hosted OpenAI work vs. OpenAI? Is it just a different provider of OpenAI? The reason I ask is that we're trying to limit the number of providers, boilerplate, etc. that we add to the codebase.
Thanks @virattt. This adds support for the Azure AI Foundry endpoint: it enables calling models through Azure's API surface (different auth and endpoint semantics), with support for many models from different providers (OpenAI, Anthropic, Meta, etc.). For example, the following screenshot shows the Kimi-K2.5 model running through Azure's abstraction layer. So the main goal of this PR is to add support for many different models through a single abstraction layer and to let users with active Azure subscriptions use the tool. You can find the model catalog here: https://ai.azure.com/catalog/models. The confusion may come from the fact that Azure Foundry endpoints can also be accessed via the regular OpenAI SDK; I can update the PR title and description to clarify that this adds an integration with the Azure AI Foundry layer.
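To illustrate the "different auth and endpoint semantics" point, here is a minimal sketch of how the two request shapes differ. This is not the PR's actual code; the function names, resource names, and API version are placeholder assumptions. Azure puts the deployment name in the URL path and authenticates with an `api-key` header plus an `api-version` query parameter, while OpenAI uses a single base URL with bearer-token auth.

```typescript
// Sketch only: function names and placeholder values are assumptions,
// not code from this PR.

interface RequestShape {
  url: string;
  headers: Record<string, string>;
}

// OpenAI: fixed base URL, Authorization: Bearer header.
function openAIRequest(apiKey: string): RequestShape {
  return {
    url: "https://api.openai.com/v1/chat/completions",
    headers: { Authorization: `Bearer ${apiKey}` },
  };
}

// Azure: per-resource endpoint, deployment name in the path,
// `api-key` header, and an `api-version` query parameter.
function azureRequest(
  endpoint: string,    // e.g. "https://<your-resource>.openai.azure.com"
  deployment: string,  // your deployment name, not the model name
  apiKey: string,
  apiVersion: string,
): RequestShape {
  return {
    url: `${endpoint}/openai/deployments/${deployment}/chat/completions?api-version=${apiVersion}`,
    headers: { "api-key": apiKey },
  };
}
```

This is why the integration mostly amounts to routing and credential plumbing rather than a new inference implementation: the request/response bodies stay OpenAI-compatible.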
Hi @virattt, did you get a chance to review the PR? Happy to hear your comments. Thanks!

Summary
This PR adds support for Azure AI Foundry models, which provide access to 11,333 models from different vendors (OpenAI, Anthropic, Meta, etc.).
What changed
- Added `azure` provider registration and routing (`azure:<deployment-name>`).
- Reads `AZURE_OPENAI_API_KEY` and `AZURE_OPENAI_ENDPOINT` from the environment.
- Updated the `/model` flow so Azure uses custom deployment-name input (similar to OpenRouter custom input).
- Ensured `azure:` prefixes render cleanly.
- Updated docs (`.env.example`, `README.md`) with Azure setup examples.

Validation
- `bun run typecheck` still reports an unrelated pre-existing error in Exa search typing.
This enables users running Azure-hosted OpenAI deployments to use Dexter without code changes, while preserving existing provider behavior.
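The `azure:<deployment-name>` routing described in "What changed" can be sketched as a small parsing helper. This is a hypothetical illustration, not the PR's actual implementation; the function and type names are assumptions.

```typescript
// Hypothetical sketch of `azure:` prefix routing (names are assumptions,
// not the PR's actual code).

interface ParsedModel {
  provider: "azure" | "default";
  // For Azure this is the deployment name; otherwise the model id as given.
  deployment: string;
}

function parseModelId(modelId: string): ParsedModel {
  const prefix = "azure:";
  if (modelId.startsWith(prefix)) {
    const deployment = modelId.slice(prefix.length);
    if (!deployment) {
      throw new Error(`Missing deployment name in "${modelId}"`);
    }
    return { provider: "azure", deployment };
  }
  // Anything without the prefix falls through to existing provider routing.
  return { provider: "default", deployment: modelId };
}
```

A scheme like this keeps existing provider behavior untouched: only ids that explicitly opt in with the `azure:` prefix are routed to the Azure endpoint.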