
Commit c5fee97

docs: add OpenAI responses api (#15868)
* docs: add tip openai page
* added responses api

Co-authored-by: mubashir1osmani <[email protected]>
1 parent 09c1ad1 commit c5fee97

File tree

1 file changed: +115 −0 lines


docs/my-website/src/pages/index.md

Lines changed: 115 additions & 0 deletions
@@ -214,6 +214,92 @@ response = completion(

</Tabs>

### Responses API

Use `litellm.responses()` with reasoning models, such as GPT-5 and o3, to get reasoning content back alongside the answer.

<Tabs>
<TabItem value="openai-responses" label="OpenAI">

```python
from litellm import responses
import os

## set ENV variables
os.environ["OPENAI_API_KEY"] = "your-api-key"

response = responses(
    model="gpt-5-mini",
    input=[{"role": "user", "content": "What is the capital of France?"}],
    reasoning_effort="medium"
)

print(response)  # text and reasoning content come back as items in response.output
```

</TabItem>
<TabItem value="anthropic-responses" label="Anthropic (Claude)">

```python
from litellm import responses
import os

## set ENV variables
os.environ["ANTHROPIC_API_KEY"] = "your-api-key"

response = responses(
    model="claude-3.5-sonnet",
    input=[{"role": "user", "content": "What is the capital of France?"}]
)
```

</TabItem>
<TabItem value="vertex-responses" label="VertexAI">

```python
from litellm import responses
import os

# auth: run 'gcloud auth application-default login'
os.environ["VERTEX_PROJECT"] = "jr-smith-386718"
os.environ["VERTEX_LOCATION"] = "us-central1"

response = responses(
    model="chat-bison",
    input=[{"role": "user", "content": "What is the capital of France?"}]
)
```

</TabItem>
<TabItem value="azure-responses" label="Azure OpenAI">

```python
from litellm import responses
import os

## set ENV variables
os.environ["AZURE_API_KEY"] = ""
os.environ["AZURE_API_BASE"] = ""
os.environ["AZURE_API_VERSION"] = ""

# azure call
response = responses(
    model="azure/<your_deployment_name>",
    input=[{"role": "user", "content": "What is the capital of France?"}]
)

print(response)
```

</TabItem>

</Tabs>
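Whichever provider is used, a Responses API result carries its text and any reasoning as typed items in `response.output`. A minimal sketch of separating the two (the item shapes below follow the OpenAI Responses wire format; the helper name and the sample items are ours, not litellm's API):

```python
def split_output(output_items):
    """Split Responses-style output items into (reasoning summary, message text)."""
    reasoning_parts, text_parts = [], []
    for item in output_items:
        if item.get("type") == "reasoning":
            # reasoning items carry a list of summary segments
            for summary in item.get("summary", []):
                reasoning_parts.append(summary.get("text", ""))
        elif item.get("type") == "message":
            # message items carry a list of content parts
            for part in item.get("content", []):
                if part.get("type") == "output_text":
                    text_parts.append(part.get("text", ""))
    return " ".join(reasoning_parts), " ".join(text_parts)

# hypothetical output list, mimicking the wire format
items = [
    {"type": "reasoning", "summary": [{"type": "summary_text", "text": "Recalling geography."}]},
    {"type": "message", "content": [{"type": "output_text", "text": "The capital of France is Paris."}]},
]
reasoning, text = split_output(items)
print(text)       # The capital of France is Paris.
print(reasoning)  # Recalling geography.
```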
### Streaming

Set `stream=True` in the `completion` args.
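With `stream=True` the call returns an iterator of chunks, and the caller concatenates the content deltas. A sketch of that consumption pattern (the chunk dicts below are stand-ins shaped like streaming chat-completion deltas, not real API output):

```python
def collect_stream(chunks):
    """Concatenate the content deltas from a stream of chat-completion chunks."""
    pieces = []
    for chunk in chunks:
        delta = chunk["choices"][0]["delta"].get("content")
        if delta:
            pieces.append(delta)
    return "".join(pieces)

# stand-in chunks; a real stream comes from completion(..., stream=True)
fake_stream = iter([
    {"choices": [{"delta": {"content": "Par"}}]},
    {"choices": [{"delta": {"content": "is"}}]},
    {"choices": [{"delta": {}}]},  # final chunk often has an empty delta
])
print(collect_stream(fake_stream))  # Paris
```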

@@ -504,6 +590,10 @@ model_list:
      api_base: os.environ/AZURE_API_BASE # runs os.getenv("AZURE_API_BASE")
      api_key: os.environ/AZURE_API_KEY # runs os.getenv("AZURE_API_KEY")
      api_version: "2023-07-01-preview"

general_settings:
  master_key: sk-1234
  database_url: postgres://
```
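The `os.environ/VAR` values in the config are indirections: litellm resolves them to `os.getenv("VAR")` at load time instead of storing secrets in the file. The resolution amounts to something like this (our own sketch, not litellm's code):

```python
import os

def resolve_env_refs(value):
    """Resolve a litellm-style 'os.environ/VAR' reference to its env value;
    pass any other value through unchanged."""
    if isinstance(value, str) and value.startswith("os.environ/"):
        return os.getenv(value.split("/", 1)[1])
    return value

os.environ["AZURE_API_KEY"] = "sk-test"
print(resolve_env_refs("os.environ/AZURE_API_KEY"))  # sk-test
print(resolve_env_refs("2023-07-01-preview"))        # literal values pass through
```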
### Step 2. RUN Docker Image

@@ -524,6 +614,9 @@ docker run \

#### Step 2: Make ChatCompletions Request to Proxy

<Tabs>
<TabItem value="chat-completions" label="Chat Completions">

```python
import openai  # openai v1.0.0+
client = openai.OpenAI(api_key="anything", base_url="http://0.0.0.0:4000")  # set proxy to base_url
```

@@ -538,6 +631,28 @@ response = client.chat.completions.create(model="gpt-3.5-turbo", messages = [

```python
print(response)
```
</TabItem>
<TabItem value="responses-api" label="Responses API">

```python
from openai import OpenAI

client = OpenAI(
    api_key="sk-1234",
    base_url="http://0.0.0.0:4000"
)

response = client.responses.create(
    model="gpt-5",
    input="Tell me a three sentence bedtime story about a unicorn."
)

print(response)
```

</TabItem>
</Tabs>
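Because the proxy speaks the OpenAI wire format, any HTTP client works, not just the OpenAI SDK. A sketch with the standard library (the endpoint path and local port mirror the docker example above; the request is built but not sent here):

```python
import json
import urllib.request

# assumed proxy endpoint from the docker example above
url = "http://0.0.0.0:4000/chat/completions"
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "What is the capital of France?"}],
}
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode(),
    headers={"Authorization": "Bearer anything", "Content-Type": "application/json"},
)
print(req.full_url)
# send with: urllib.request.urlopen(req)
```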
## More details

- [exception mapping](../../docs/exception_mapping)
