src/oss/python/integrations/chat/azure_chat_openai.mdx (+8, -4)
@@ -12,11 +12,15 @@ You can find information about Azure OpenAI's latest models and their costs, con
 Azure OpenAI refers to OpenAI models hosted on the [Microsoft Azure platform](https://azure.microsoft.com/en-us/products/ai-services/openai-service). OpenAI also provides its own model APIs. To access OpenAI services directly, use the [`ChatOpenAI` integration](/oss/integrations/chat/openai/).
 </Info>
 
-<Tip>
-**API Reference**
+<Info>
+**Azure OpenAI v1 API**
+
+Azure OpenAI's [v1 API](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/api-version-lifecycle?tabs=python#next-generation-api-1) (Generally Available as of August 2025) allows you to use `ChatOpenAI` directly with Azure endpoints. This provides a unified interface and native support for Microsoft Entra ID authentication with automatic token refresh.
 
-For detailed documentation of all features and configuration options, head to the @[`AzureChatOpenAI`] API reference.
-</Tip>
+See the [ChatOpenAI Azure section](/oss/integrations/chat/openai#using-with-azure-openai) for details on using `ChatOpenAI` with Azure's v1 API.
+
+`AzureChatOpenAI` is still supported for traditional Azure OpenAI API versions and for scenarios that require Azure-specific configuration, but we recommend using `ChatOpenAI` or the `AzureAIChatCompletionsModel` in [LangChain Azure AI](https://docs.langchain.com/oss/python/integrations/providers/azure_ai) going forward.
+</Info>
 
 <Note>
 @[`AzureChatOpenAI`] shares the same underlying base implementation as @[`ChatOpenAI`],
src/oss/python/integrations/chat/openai.mdx (+96, -7)
@@ -18,12 +18,6 @@ You can find information about OpenAI's latest models, their costs, context wind
 `ChatOpenAI` is fully compatible with OpenAI's (legacy) [Chat Completions API](https://platform.openai.com/docs/guides/completions). If you are looking to connect to other model providers that support the Chat Completions API, you can do so – see [instructions](/oss/integrations/chat#chat-completions-api).
 </Note>
 
-<Info>
-**OpenAI models hosted on Azure**
-
-Note that certain OpenAI models can also be accessed via the [Microsoft Azure platform](https://azure.microsoft.com/en-us/products/ai-foundry/models/openai/). To use the Azure OpenAI service use the [`AzureChatOpenAI`](/oss/integrations/chat/azure_chat_openai/) integration.
-</Info>
-
 ## Overview
 
 ### Integration details

@@ -87,7 +81,7 @@ llm = ChatOpenAI(
     # timeout=None,
     # reasoning_effort="low",
     # max_retries=2,
-    # api_key="...",  # if you prefer to pass api key in directly instaed of using env vars
+    # api_key="...",  # if you prefer to pass api key in directly instead of using env vars
     # base_url="...",
     # organization="...",
     # other params...
@@ -136,6 +130,101 @@ from langchain_openai import ChatOpenAI
+## Using with Azure OpenAI
+
+<Info>
+**Azure OpenAI v1 API support**
+
+As of `langchain-openai>=1.0.1`, `ChatOpenAI` can be used directly with Azure OpenAI endpoints using the new [v1 API](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/api-version-lifecycle?tabs=python#next-generation-api-1). This provides a unified way to use OpenAI models whether hosted on OpenAI or Azure.
+
+For the traditional Azure-specific implementation, continue to use [`AzureChatOpenAI`](/oss/integrations/chat/azure_chat_openai/).
+</Info>
+
+<Accordion title="Using Azure OpenAI v1 API with API Key">
+
+To use `ChatOpenAI` with Azure OpenAI, set the `base_url` to your Azure endpoint with `/openai/v1/` appended:
+
+```python
+from langchain_openai import ChatOpenAI
+
+llm = ChatOpenAI(
+    model="gpt-5-nano",  # Your Azure deployment name
+    base_url="https://your-resource.openai.azure.com/openai/v1/",
+    api_key="your-azure-api-key"
+)
+```
+
+</Accordion>
+
+<Accordion title="Using Azure OpenAI with Microsoft Entra ID">
+
+The v1 API adds native support for [Microsoft Entra ID](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/how-to/managed-identity) (formerly Azure AD) authentication with automatic token refresh. Pass a token provider callable to the `api_key` parameter:
+
+```python
+from azure.identity import DefaultAzureCredential, get_bearer_token_provider
+from langchain_openai import ChatOpenAI
+
+# Create a token provider that handles automatic refresh
+token_provider = get_bearer_token_provider(
+    DefaultAzureCredential(),
+    "https://cognitiveservices.azure.com/.default"
+)
+
+llm = ChatOpenAI(
+    model="gpt-5-nano",  # Your Azure deployment name
+    base_url="https://your-resource.openai.azure.com/openai/v1/",
+    api_key=token_provider  # Callable that handles token refresh
+)
+
+# Use the model as normal
+messages = [
+    ("system", "You are a helpful assistant."),
+    ("human", "Translate 'I love programming' to French.")
+]
+response = llm.invoke(messages)
+print(response.content)
+```
+
+The token provider is a callable that automatically retrieves and refreshes authentication tokens, eliminating the need to manually manage token expiration.
+
+<Tip>
+**Installation requirements**
+
+To use Microsoft Entra ID authentication, install the Azure Identity library:
+
+```bash
+pip install azure-identity
+```
+
+</Tip>
+
+You can also pass a token provider callable to the `api_key` parameter when using asynchronous functions. You must import `DefaultAzureCredential` from `azure.identity.aio`:
+
+```python
+from azure.identity.aio import DefaultAzureCredential
+from langchain_openai import ChatOpenAI
+
+credential = DefaultAzureCredential()
+
+llm_async = ChatOpenAI(
+    model="gpt-5-nano",
+    api_key=credential
+)
+
+# Use async methods when using an async callable
+response = await llm_async.ainvoke("Hello!")
+```
+
+<Note>
+When using an async callable for the API key, you must use async methods (`ainvoke`, `astream`, etc.). Sync methods will raise an error.
+</Note>
+
+</Accordion>
+
 ## Tool calling
 
 OpenAI has a [tool calling](https://platform.openai.com/docs/guides/function-calling) API (we use "tool calling" and "function calling" interchangeably here) that lets you describe tools and their arguments, and have the model return a JSON object with a tool to invoke and the inputs to that tool. Tool calling is extremely useful for building tool-using chains and agents, and for getting structured outputs from models more generally.
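The section above leans on the idea of a token provider: a plain callable that caches a credential and refreshes it before expiry. As a minimal sketch of that pattern in plain Python (a hypothetical stand-in — `azure-identity`'s `get_bearer_token_provider` does this for real Entra ID tokens):

```python
import time

def make_token_provider(fetch_token, lifetime_s=3600, refresh_margin_s=300):
    """Wrap a token-fetching function in a callable that caches the token
    and refreshes it shortly before expiry. Hypothetical illustration of
    the pattern, not azure-identity's actual implementation."""
    state = {"token": None, "expires_at": 0.0}

    def provider():
        now = time.monotonic()
        if state["token"] is None or now >= state["expires_at"] - refresh_margin_s:
            state["token"] = fetch_token()          # e.g. call the identity service
            state["expires_at"] = now + lifetime_s  # schedule the next refresh
        return state["token"]

    return provider

# The returned callable is what you would hand to an `api_key` parameter:
provider = make_token_provider(lambda: "demo-token")
print(provider())  # fetches once; subsequent calls reuse the cached token
```

Because callers only see a zero-argument callable, the consumer (here, the chat client) never has to know when or how the underlying token is rotated.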
src/oss/python/integrations/text_embedding/openai.mdx (+95)
@@ -57,6 +57,12 @@ embeddings = OpenAIEmbeddings(
 )
 ```
 
+<Info>
+**Azure OpenAI v1 API support**
+
+As of `langchain-openai>=1.0.1`, `OpenAIEmbeddings` can be used directly with Azure OpenAI endpoints using the new [v1 API](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/api-version-lifecycle?tabs=python#next-generation-api-1), including support for Microsoft Entra ID authentication. See the [Using with Azure OpenAI](#using-with-azure-openai) section below for details.
+</Info>
+
 ## Indexing and Retrieval
 
 Embedding models are often used in retrieval-augmented generation (RAG) flows, both as part of indexing data as well as later retrieving it. For more detailed instructions, please see our [RAG tutorials](/oss/langchain/rag).

⋮

+## Using with Azure OpenAI
+
+<Info>
+**Azure OpenAI v1 API support**
+
+As of `langchain-openai>=1.0.1`, `OpenAIEmbeddings` can be used directly with Azure OpenAI endpoints using the new [v1 API](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/api-version-lifecycle?tabs=python#next-generation-api-1). This provides a unified way to use OpenAI embeddings whether hosted on OpenAI or Azure.
+
+For the traditional Azure-specific implementation, continue to use [`AzureOpenAIEmbeddings`](/oss/integrations/text_embedding/azure_openai).
+</Info>
+
+### Using Azure OpenAI v1 API with API Key
+
+To use `OpenAIEmbeddings` with Azure OpenAI, set the `base_url` to your Azure endpoint with `/openai/v1/` appended:
+
+```python
+from langchain_openai import OpenAIEmbeddings
+
+embeddings = OpenAIEmbeddings(
+    model="text-embedding-3-large",  # Your Azure deployment name
+    base_url="https://your-resource.openai.azure.com/openai/v1/",
+    api_key="your-azure-api-key"
+)
+```
+
+### Using Azure OpenAI with Microsoft Entra ID
+
+The v1 API adds native support for [Microsoft Entra ID](https://learn.microsoft.com/en-us/azure/ai-foundry/openai/how-to/managed-identity) authentication with automatic token refresh. Pass a token provider callable to the `api_key` parameter:
+
+```python
+from azure.identity import DefaultAzureCredential, get_bearer_token_provider
+from langchain_openai import OpenAIEmbeddings
+
+# Create a token provider that handles automatic refresh
+token_provider = get_bearer_token_provider(
+    DefaultAzureCredential(),
+    "https://cognitiveservices.azure.com/.default"
+)
+
+embeddings = OpenAIEmbeddings(
+    model="text-embedding-3-large",  # Your Azure deployment name
+    base_url="https://your-resource.openai.azure.com/openai/v1/",
+    api_key=token_provider
+)
+```

⋮

+<Note>
+When using an async callable for the API key, you must use async methods (`aembed_query`, `aembed_documents`). Sync methods will raise an error.
+</Note>
 
 ## API reference
 
 For detailed documentation on `OpenAIEmbeddings` features and configuration options, please refer to the [API reference](https://python.langchain.com/api_reference/openai/embeddings/langchain_openai.embeddings.base.OpenAIEmbeddings.html).
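Both the chat and embeddings changes in this PR hinge on the same URL rule: the v1 `base_url` is the Azure endpoint with `/openai/v1/` appended. A small sketch of that construction (helper name and resource value are hypothetical, for illustration only):

```python
def azure_v1_base_url(endpoint: str) -> str:
    """Append /openai/v1/ to an Azure OpenAI endpoint, as the docs
    describe, tolerating a trailing slash. Hypothetical helper."""
    return endpoint.rstrip("/") + "/openai/v1/"

# "my-resource" is a placeholder resource name:
print(azure_v1_base_url("https://my-resource.openai.azure.com"))
# The result is what you would pass as `base_url` to ChatOpenAI or
# OpenAIEmbeddings when targeting Azure's v1 API.
```

Normalizing the trailing slash avoids the easy-to-miss `...com//openai/v1/` mistake when the endpoint is copied from the Azure portal with a slash already attached.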