
Can't run o1-preview and o1-mini models in the LangSmith playground #1120

Open
ogamaniuk opened this issue Oct 24, 2024 · 2 comments

Comments

@ogamaniuk

Issue you'd like to raise.

[screenshot omitted]

Here's the error message:

BadRequestError('Error code: 400 - {'error': {'message': "Unsupported value: 'stream' does not support true with this model. Only the default (false) value is supported.", 'type': 'invalid_request_error', 'param': 'stream', 'code': 'unsupported_value'}}')

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 2303, in _atransform_stream_with_config
  File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 3385, in _atransform
  File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 5573, in atransform
  File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 1473, in atransform
  File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 494, in astream
  File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 472, in astream
  File "/usr/local/lib/python3.11/site-packages/langchain_openai/chat_models/base.py", line 2015, in _astream
  File "/usr/local/lib/python3.11/site-packages/langchain_openai/chat_models/base.py", line 785, in _astream
  File "/usr/local/lib/python3.11/site-packages/ddtrace/contrib/internal/openai/patch.py", line 289, in patched_endpoint
    resp = await func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 1490, in create
  File "/usr/local/lib/python3.11/site-packages/openai/_base_client.py", line 1831, in post
  File "/usr/local/lib/python3.11/site-packages/openai/_base_client.py", line 1525, in request
  File "/usr/local/lib/python3.11/site-packages/openai/_base_client.py", line 1626, in _request
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'stream' does not support true with this model. Only the default (false) value is supported.", 'type': 'invalid_request_error', 'param': 'stream', 'code': 'unsupported_value'}}
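
For what it's worth, the 400 appears to come straight from the OpenAI API rather than from LangSmith itself: per the error message, these models only accept the default non-streaming mode and reject `stream=true`. A minimal sketch against the raw OpenAI SDK (the prompt is just a placeholder) that shows the difference:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Non-streaming (the default): accepted by o1-preview / o1-mini.
resp = client.chat.completions.create(
    model="o1-preview",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(resp.choices[0].message.content)

# Streaming: rejected with the 400 "unsupported_value" error shown above.
try:
    stream = client.chat.completions.create(
        model="o1-preview",
        messages=[{"role": "user", "content": "Say hello"}],
        stream=True,
    )
    for chunk in stream:
        print(chunk.choices[0].delta.content or "", end="")
except Exception as exc:
    print(f"streaming request failed: {exc}")
```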

Suggestion:

No response

@gautamm20

This is a major annoyance. It can be fixed with the correct Pydantic validation.

@alex00x0

Have you tried deactivating streaming as shown here?
https://langchain-ai.github.io/langgraph/how-tos/disable-streaming/?h=o1#disabling-streaming
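
If I read that how-to correctly, the idea is roughly the following (a minimal sketch, assuming a recent langchain-core / langchain-openai that exposes the `disable_streaming` flag):

```python
from langchain_openai import ChatOpenAI

# disable_streaming=True makes the model fall back to a non-streaming call
# even when the caller (e.g. LangGraph or an astream() pipeline) requests
# streaming, so the request goes out with stream=false, which o1 accepts.
llm = ChatOpenAI(model="o1-preview", disable_streaming=True)

print(llm.invoke("Say hello").content)
```

That helps in code you control; whether the hosted playground exposes an equivalent toggle is a separate question.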
