
Commit da40480

Update README.md
1 parent 66424fe commit da40480

File tree

1 file changed

+24
-0
lines changed


README.md

Lines changed: 24 additions & 0 deletions
@@ -126,6 +126,30 @@ async def main() -> None:
    asyncio.run(main())
```

## Streaming

Streaming responses are supported via Server-Sent Events (SSE) for Serverless Inference and Agents.

```python
import os

from gradientai import GradientAI

client = GradientAI(
    inference_key=os.environ.get("GRADIENTAI_INFERENCE_KEY"),
)

response = client.chat.completions.create(
    model="llama3.3-70b-instruct",
    messages=[{"role": "user", "content": "Write a story about a brave squirrel."}],
    stream=True,
)

for chunk in response:
    if len(chunk.choices) > 0:
        if chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)
```
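With `stream=True`, the client consumes a Server-Sent Events response body: events are `data:` lines separated by blank lines, with OpenAI-style streams ending in a `[DONE]` sentinel. As a minimal, stdlib-only sketch of the wire format the SDK parses for you (not the library's actual parser; the sample payload shape is an assumption for illustration):

```python
import json


def iter_sse_data(raw: str):
    """Yield the JSON payload of each event in a raw SSE response body.

    SSE separates events with a blank line; each event's payload is
    carried on ``data:`` lines. OpenAI-style streams end with a literal
    ``[DONE]`` sentinel rather than a JSON payload.
    """
    for event in raw.split("\n\n"):
        for line in event.splitlines():
            if line.startswith("data:"):
                payload = line[len("data:"):].strip()
                if payload == "[DONE]":
                    return
                yield json.loads(payload)


# Hypothetical chunk payloads, shaped like the chat-completion deltas above:
raw = (
    'data: {"choices": [{"delta": {"content": "Once"}}]}\n\n'
    'data: {"choices": [{"delta": {"content": " upon"}}]}\n\n'
    'data: [DONE]\n\n'
)
text = "".join(
    c["choices"][0]["delta"].get("content", "") for c in iter_sse_data(raw)
)
print(text)  # → Once upon
```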
## Using types

Nested request parameters are [TypedDicts](https://docs.python.org/3/library/typing.html#typing.TypedDict). Responses are [Pydantic models](https://docs.pydantic.dev) which also provide helper methods for things like:

0 commit comments
