When configuring BYOK mode with engine: copilot to route inference to a custom model deployment on Azure AI Foundry,
the agent step fails immediately with:
Model '<model-name>' not found on provider at http://172.30.0.30:10002 (HTTP 404)
This happens regardless of whether COPILOT_MODEL is set to the generic model family name or the deployment name.
From the logs, the harness fetches a fixed list of ~25 models from the proxy (api-proxy:10002/models). These appear
to be the standard GitHub Copilot models. The external BYOK provider's deployment never appears to be registered or
discoverable in that list.
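To confirm the deployment really is absent from the catalog, something like the snippet below can fetch the proxy's `/models` list and check for the deployment name. The proxy address is taken from the error message above, and the OpenAI-style `{"data": [{"id": ...}]}` response shape is an assumption; adjust both if the proxy differs.

```python
import json
from urllib.request import urlopen  # only needed for the live check below

def catalog_ids(models_json: str) -> set:
    """Extract model ids from an OpenAI-style /models response body."""
    return {m["id"] for m in json.loads(models_json).get("data", [])}

def model_in_catalog(models_json: str, name: str) -> bool:
    """True if the given model/deployment name appears in the catalog."""
    return name in catalog_ids(models_json)

# Live check against the proxy (address from the error message; assumed reachable):
#   body = urlopen("http://172.30.0.30:10002/models").read().decode()
#   print(model_in_catalog(body, "gpt-5-mini-mycustom"))
```

In my case the deployment name never shows up, only the ~25 standard Copilot models.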
Expected behaviour: In BYOK mode, inference requests for a model that exists on the external provider should be
routed there — either by merging the external provider's model list into the proxy catalog, or by accepting any
COPILOT_MODEL value when COPILOT_PROVIDER_BASE_URL is set.
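As a sketch of what I mean (purely illustrative; none of these function or parameter names come from the harness), the lookup could fall back to the BYOK provider instead of failing at the proxy catalog:

```python
def resolve_model(requested, proxy_catalog, provider_catalog=None, byok_base_url=None):
    """Proposed lookup order: proxy catalog first, then the BYOK provider's
    catalog; as a last resort, accept any name once a BYOK base URL is set
    and let the provider itself reject unknown deployments."""
    if requested in proxy_catalog:
        return ("proxy", requested)
    if provider_catalog and requested in provider_catalog:
        return ("provider", requested)
    if byok_base_url:
        # Trust the configured external provider to validate the name.
        return ("provider", requested)
    raise LookupError(f"Model {requested!r} not found on provider")
```

Either branch would make custom Azure AI Foundry deployment names usable without them having to appear in the proxy's fixed list.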
Config used:
engine:
  id: copilot
  env:
    COPILOT_PROVIDER_BASE_URL: ${{ secrets.COPILOT_PROVIDER_BASE_URL }}
    COPILOT_MODEL: gpt-5-mini-mycustom  # also tried gpt-5-mini, same result
    COPILOT_PROVIDER_API_KEY: ${{ secrets.COPILOT_PROVIDER_API_KEY }}
    COPILOT_PROVIDER_TYPE: azure
    COPILOT_PROVIDER_WIRE_API: responses
    COPILOT_PROVIDER_MODEL_ID: gpt-5-mini-mycustom
I followed the BYOK documentation as closely as I could, so it's entirely possible I'm misconfiguring something (this is also my first time working with Foundry).
Thanks in advance!