Commit 20b0078

docs: Adding in Docs for using Azure Default Credential Callable with OpenAI (#1160)

1 parent cd5e7f3

File tree (3 files changed, +199 −11 lines):

src/oss/python/integrations/chat/azure_chat_openai.mdx
src/oss/python/integrations/chat/openai.mdx
src/oss/python/integrations/text_embedding/openai.mdx

src/oss/python/integrations/chat/azure_chat_openai.mdx

Lines changed: 8 additions & 4 deletions

````diff
@@ -12,11 +12,15 @@ You can find information about Azure OpenAI's latest models and their costs, con
 Azure OpenAI refers to OpenAI models hosted on the [Microsoft Azure platform](https://azure.microsoft.com/en-us/products/ai-services/openai-service). OpenAI also provides its own model APIs. To access OpenAI services directly, use the [`ChatOpenAI` integration](/oss/integrations/chat/openai/).
 </Info>
 
-<Tip>
-**API Reference**
+<Info>
+**Azure OpenAI v1 API**
+
+Azure OpenAI's [v1 API](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/api-version-lifecycle?tabs=python#next-generation-api-1) (Generally Available as of August 2025) allows you to use `ChatOpenAI` directly with Azure endpoints. This provides a unified interface and native support for Microsoft Entra ID authentication with automatic token refresh.
 
-For detailed documentation of all features and configuration options, head to the @[`AzureChatOpenAI`] API reference.
-</Tip>
+See the [ChatOpenAI Azure section](/oss/integrations/chat/openai#using-with-azure-openai) for details on using `ChatOpenAI` with Azure's v1 API.
+
+`AzureChatOpenAI` is still currently supported for traditional Azure OpenAI API versions and scenarios requiring Azure-specific configurations, but we recommend using `ChatOpenAI` or the `AzureAIChatCompletionsModel` in [LangChain Azure AI](https://docs.langchain.com/oss/python/integrations/providers/azure_ai) going forward.
+</Info>
 
 <Note>
 @[`AzureChatOpenAI`] shares the same underlying base implementation as @[`ChatOpenAI`],
````
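
The endpoint shape this change relies on (the regular Azure resource endpoint with `/openai/v1/` appended) can be sketched with plain string formatting; the resource name below is a made-up placeholder, not a real Azure resource:

```python
# Sketch of the Azure OpenAI v1 base URL: the usual resource endpoint with
# /openai/v1/ appended. "my-resource" is a placeholder resource name.
resource = "my-resource"
base_url = f"https://{resource}.openai.azure.com/openai/v1/"
assert base_url == "https://my-resource.openai.azure.com/openai/v1/"
```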

src/oss/python/integrations/chat/openai.mdx

Lines changed: 96 additions & 7 deletions

````diff
@@ -18,12 +18,6 @@ You can find information about OpenAI's latest models, their costs, context wind
 `ChatOpenAI` is fully compatible with OpenAI's (legacy) [Chat Completions API](https://platform.openai.com/docs/guides/completions). If you are looking to connect to other model providers that support the Chat Completions API, you can do so – see [instructions](/oss/integrations/chat#chat-completions-api).
 </Note>
 
-<Info>
-**OpenAI models hosted on Azure**
-
-Note that certain OpenAI models can also be accessed via the [Microsoft Azure platform](https://azure.microsoft.com/en-us/products/ai-foundry/models/openai/). To use the Azure OpenAI service use the [`AzureChatOpenAI`](/oss/integrations/chat/azure_chat_openai/) integration.
-</Info>
-
 ## Overview
 
 ### Integration details
@@ -87,7 +81,7 @@ llm = ChatOpenAI(
     # timeout=None,
     # reasoning_effort="low",
     # max_retries=2,
-    # api_key="...",  # if you prefer to pass api key in directly instaed of using env vars
+    # api_key="...",  # if you prefer to pass api key in directly instead of using env vars
     # base_url="...",
     # organization="...",
     # other params...
@@ -136,6 +130,101 @@ from langchain_openai import ChatOpenAI
 llm = ChatOpenAI(model="gpt-4.1-mini", stream_usage=True) # [!code highlight]
 ```
 
+## Using with Azure OpenAI
+
+<Info>
+**Azure OpenAI v1 API support**
+
+As of `langchain-openai>=1.0.1`, `ChatOpenAI` can be used directly with Azure OpenAI endpoints using the new [v1 API](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/api-version-lifecycle?tabs=python#next-generation-api-1). This provides a unified way to use OpenAI models whether hosted on OpenAI or Azure.
+
+For the traditional Azure-specific implementation, continue to use [`AzureChatOpenAI`](/oss/integrations/chat/azure_chat_openai/).
+</Info>
+
+<Accordion title="Using Azure OpenAI v1 API with API Key">
+
+To use `ChatOpenAI` with Azure OpenAI, set the `base_url` to your Azure endpoint with `/openai/v1/` appended:
+
+```python
+from langchain_openai import ChatOpenAI
+
+llm = ChatOpenAI(
+    model="gpt-5-mini",  # Your Azure deployment name
+    base_url="https://{your-resource-name}.openai.azure.com/openai/v1/",
+    api_key="your-azure-api-key"
+)
+
+response = llm.invoke("Hello, how are you?")
+print(response.content)
+```
+</Accordion>
+
+<Accordion title="Using Azure OpenAI with Microsoft Entra ID">
+
+The v1 API adds native support for [Microsoft Entra ID](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/how-to/managed-identity) (formerly Azure AD) authentication with automatic token refresh. Pass a token provider callable to the `api_key` parameter:
+
+```python
+from azure.identity import DefaultAzureCredential, get_bearer_token_provider
+from langchain_openai import ChatOpenAI
+
+# Create a token provider that handles automatic refresh
+token_provider = get_bearer_token_provider(
+    DefaultAzureCredential(),
+    "https://cognitiveservices.azure.com/.default"
+)
+
+llm = ChatOpenAI(
+    model="gpt-5-mini",  # Your Azure deployment name
+    base_url="https://{your-resource-name}.openai.azure.com/openai/v1/",
+    api_key=token_provider  # Callable that handles token refresh
+)
+
+# Use the model as normal
+messages = [
+    ("system", "You are a helpful assistant."),
+    ("human", "Translate 'I love programming' to French.")
+]
+response = llm.invoke(messages)
+print(response.content)
+```
+
+The token provider is a callable that automatically retrieves and refreshes authentication tokens, eliminating the need to manually manage token expiration.
+
+<Tip>
+**Installation requirements**
+
+To use Microsoft Entra ID authentication, install the Azure Identity library:
+
+```bash
+pip install azure-identity
+```
+</Tip>
+
+You can also pass a token provider callable to the `api_key` parameter when using
+asynchronous functions. You must import `DefaultAzureCredential` from `azure.identity.aio`:
+
+```python
+from azure.identity.aio import DefaultAzureCredential
+from langchain_openai import ChatOpenAI
+
+credential = DefaultAzureCredential()
+
+llm_async = ChatOpenAI(
+    model="gpt-5-nano",
+    api_key=credential
+)
+
+# Use async methods when using async callable
+response = await llm_async.ainvoke("Hello!")
+```
+
+<Note>
+When using an async callable for the API key, you must use async methods (`ainvoke`, `astream`, etc.). Sync methods will raise an error.
+</Note>
+
+</Accordion>
+
 ## Tool calling
 
 OpenAI has a [tool calling](https://platform.openai.com/docs/guides/function-calling) (we use "tool calling" and "function calling" interchangeably here) API that lets you describe tools and their arguments, and have the model return a JSON object with a tool to invoke and the inputs to that tool. Tool calling is extremely useful for building tool-using chains and agents, and for getting structured outputs from models more generally.
````
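
The token-provider pattern this diff documents (a zero-argument callable that returns a fresh bearer token, like the one `get_bearer_token_provider` produces) can be sketched with the standard library only; this fake provider is purely illustrative and is not azure-identity code:

```python
# Sketch of a token provider: a zero-argument callable that caches a token and
# transparently refreshes it after expiry. A real provider would call the
# Microsoft Entra ID service instead of fabricating a token string.
import time
from typing import Callable

def make_fake_token_provider(ttl_seconds: float) -> Callable[[], str]:
    state = {"token": None, "expires_at": 0.0, "issued": 0}

    def provider() -> str:
        now = time.monotonic()
        if state["token"] is None or now >= state["expires_at"]:
            state["issued"] += 1
            state["token"] = f"fake-token-{state['issued']}"  # stand-in for a JWT
            state["expires_at"] = now + ttl_seconds
        return state["token"]

    return provider

provider = make_fake_token_provider(ttl_seconds=0.05)
first = provider()
assert provider() == first   # cached while still valid
time.sleep(0.06)
assert provider() != first   # transparently refreshed after expiry
```

Because the refresh happens inside the callable, the client that receives it (here, `ChatOpenAI` via `api_key=`) never has to track token expiration itself.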

src/oss/python/integrations/text_embedding/openai.mdx

Lines changed: 95 additions & 0 deletions

````diff
@@ -57,6 +57,12 @@ embeddings = OpenAIEmbeddings(
 )
 ```
 
+<Info>
+**Azure OpenAI v1 API support**
+
+As of `langchain-openai>=1.0.1`, `OpenAIEmbeddings` can be used directly with Azure OpenAI endpoints using the new [v1 API](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/api-version-lifecycle?tabs=python#next-generation-api-1), including support for Microsoft Entra ID authentication. See the [Using with Azure OpenAI](#using-with-azure-openai) section below for details.
+</Info>
+
 ## Indexing and Retrieval
 
 Embedding models are often used in retrieval-augmented generation (RAG) flows, both as part of indexing data as well as later retrieving it. For more detailed instructions, please see our [RAG tutorials](/oss/langchain/rag).
@@ -125,6 +131,95 @@ for vector in two_vectors:
 [-0.010181212797760963, 0.023419594392180443, -0.04215526953339577, -0.001532090245746076, -0.023573
 ```
 
+## Using with Azure OpenAI
+
+<Info>
+**Azure OpenAI v1 API support**
+
+As of `langchain-openai>=1.0.1`, `OpenAIEmbeddings` can be used directly with Azure OpenAI endpoints using the new [v1 API](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/api-version-lifecycle?tabs=python#next-generation-api-1). This provides a unified way to use OpenAI embeddings whether hosted on OpenAI or Azure.
+
+For the traditional Azure-specific implementation, continue to use [`AzureOpenAIEmbeddings`](/oss/integrations/text_embedding/azure_openai).
+</Info>
+
+### Using Azure OpenAI v1 API with API Key
+
+To use `OpenAIEmbeddings` with Azure OpenAI, set the `base_url` to your Azure endpoint with `/openai/v1/` appended:
+
+```python
+from langchain_openai import OpenAIEmbeddings
+
+embeddings = OpenAIEmbeddings(
+    model="text-embedding-3-large",  # Your Azure deployment name
+    base_url="https://{your-resource-name}.openai.azure.com/openai/v1/",
+    api_key="your-azure-api-key"
+)
+
+# Use as normal
+vector = embeddings.embed_query("Hello world")
+```
+
+### Using Azure OpenAI with Microsoft Entra ID
+
+The v1 API adds native support for [Microsoft Entra ID](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/how-to/managed-identity) authentication with automatic token refresh. Pass a token provider callable to the `api_key` parameter:
+
+```python
+from azure.identity import DefaultAzureCredential, get_bearer_token_provider
+from langchain_openai import OpenAIEmbeddings
+
+# Create a token provider that handles automatic refresh
+token_provider = get_bearer_token_provider(
+    DefaultAzureCredential(),
+    "https://cognitiveservices.azure.com/.default"
+)
+
+embeddings = OpenAIEmbeddings(
+    model="text-embedding-3-large",  # Your Azure deployment name
+    base_url="https://{your-resource-name}.openai.azure.com/openai/v1/",
+    api_key=token_provider  # Callable that handles token refresh
+)
+
+# Use as normal
+vectors = embeddings.embed_documents(["Hello", "World"])
+```
+
+<Tip>
+**Installation requirements**
+
+To use Microsoft Entra ID authentication, install the Azure Identity library:
+
+```bash
+pip install azure-identity
+```
+</Tip>
+
+You can also pass a token provider callable to the `api_key` parameter when using
+asynchronous functions. You must import `DefaultAzureCredential` from `azure.identity.aio`:
+
+:::python
+
+```python
+from azure.identity.aio import DefaultAzureCredential
+from langchain_openai import OpenAIEmbeddings
+
+credential = DefaultAzureCredential()
+
+embeddings_async = OpenAIEmbeddings(
+    model="text-embedding-3-large",
+    api_key=credential
+)
+
+# Use async methods when using async callable
+vectors = await embeddings_async.aembed_documents(["Hello", "World"])
+```
+
+:::
+
+<Note>
+When using an async callable for the API key, you must use async methods (`aembed_query`, `aembed_documents`). Sync methods will raise an error.
+</Note>
+
 ## API reference
 
 For detailed documentation on `OpenAIEmbeddings` features and configuration options, please refer to the [API reference](https://python.langchain.com/api_reference/openai/embeddings/langchain_openai.embeddings.base.OpenAIEmbeddings.html).
````
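
The `<Note>` in both diffs says an async credential only works with async methods. The underlying mechanics can be sketched with the standard library alone (the provider below is illustrative, not azure-identity code): calling an async function without awaiting it yields a coroutine object, not a token string, so a synchronous code path cannot consume it.

```python
# Sketch: why an async credential requires async methods. A sync caller that
# invokes the provider gets a coroutine, not a token; only awaiting it (as the
# a* methods do) produces the string. Names here are made up for illustration.
import asyncio
import inspect

async def async_token_provider() -> str:
    # A real async credential would await the identity service here.
    return "fake-token"

sync_result = async_token_provider()       # a coroutine object, not a token
assert inspect.iscoroutine(sync_result)
sync_result.close()                        # avoid a "never awaited" warning

token = asyncio.run(async_token_provider())  # the async path yields the token
assert token == "fake-token"
```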
