@chenmoneygithub (Collaborator)

Reasoning models' responses contain reasoning content, and this PR introduces dspy.Reasoning so that the module output captures the native reasoning instead of having the LM regenerate it.

Usage:

import dspy

dspy.configure(lm=dspy.LM("anthropic/claude-3-7-sonnet-20250219", cache=False))

class MySignature(dspy.Signature):
    question: str = dspy.InputField()
    reasoning: dspy.Reasoning = dspy.OutputField()
    answer: str = dspy.OutputField()


print(dspy.Predict(MySignature)(question="why did a chicken cross the kitchen?"))

For non-reasoning models, the code above treats the dspy.Reasoning field as a string field, which prompts the model to generate the reasoning explicitly.

There is a caveat with GPT-5 family models, though: when using litellm chat completion with GPT-5 family models, the response doesn't contain the reasoning content. So for now we treat GPT-5 family models as non-reasoning models.

is_last_chunk=self.stream_end,
)

try:
Collaborator:

nit: maybe add a comment explaining why this logic should come after the native response handling?

Collaborator (Author):

good call, done!

origin = get_origin(annotation)
args = get_args(annotation)
if origin is None:
if annotation is Reasoning:
@TomeHirata (Collaborator), Oct 29, 2025:

Is there any way to implement the conversion more generically? Ideally this information should reside in Reasoning.

Collaborator (Author):

That's a good question.

I did think about the same thing, but changing the __name__ in dspy.Reasoning could lead to confusion, because it is essentially just a type; it is only from the perspective of the DSPy adapter that it is treated as a string. So I kept the logic inside the adapter utils.
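The conversion under discussion can be sketched end to end. This is a hedged illustration only: the Reasoning stand-in and the resolve_annotation helper are names assumed for the example, not DSPy's actual API.

```python
from typing import get_origin


class Reasoning(str):
    """Stand-in for dspy.Reasoning (assumed); the adapter treats it as str."""


def resolve_annotation(annotation):
    """Map the Reasoning marker down to str; leave other annotations alone."""
    origin = get_origin(annotation)
    if origin is None:
        # Bare annotation such as Reasoning, str, or int.
        return str if annotation is Reasoning else annotation
    # Parameterized annotations (list[str], dict[str, int], ...) pass through.
    return annotation
```

Keeping this mapping in the adapter utils rather than in Reasoning itself matches the author's point: Reasoning stays a plain type, and only the adapter decides to treat it as a string.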

field_type = field_info.annotation

-if get_dspy_field_type(field_info) == "input" or field_type is str:
+if get_dspy_field_type(field_info) == "input" or field_type is str or field_type is Reasoning:
Collaborator:

ditto

Collaborator (Author):

same as above, let me know your thoughts!

return signature

if "gpt-5" in lm.model and lm.model_type == "chat":
    # There is a caveat in litellm as of 1.79.0: when using the chat completion API on GPT-5 family models,
Collaborator:

Is there a GitHub issue for this on LiteLLM?

Collaborator (Author):

good call! added
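The check above can be read as a small predicate. A hedged sketch, with a function name of our choosing rather than the PR's:

```python
def is_gpt5_chat(model: str, model_type: str) -> bool:
    """True when the litellm chat-completion caveat applies and the model
    should be treated as non-reasoning (reasoning generated as a string)."""
    return "gpt-5" in model and model_type == "chat"
```

This keeps the workaround in one place, so it can be deleted once the upstream litellm issue is resolved.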

lm: "LM",
lm_kwargs: dict[str, Any],
) -> type["Signature"]:
reasoning_effort = "unspecified"
Collaborator:

Is unspecified a valid reasoning effort?

Collaborator (Author):

the code was a bit misleading, refactored
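One common way to avoid a misleading placeholder like "unspecified" is a sentinel that distinguishes "not provided" from an explicit value. A sketch under assumed names (resolve_reasoning_effort is not the PR's actual helper):

```python
# Module-level sentinel: a unique object no caller can accidentally pass in.
_UNSET = object()


def resolve_reasoning_effort(lm_kwargs: dict):
    """Return the caller-supplied reasoning effort, or None when absent."""
    effort = lm_kwargs.get("reasoning_effort", _UNSET)
    # The sentinel distinguishes a missing key from any explicit value.
    return None if effort is _UNSET else effort
```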

The reasoning content (str) if available, None otherwise.
"""
try:
if hasattr(chunk, "choices") and chunk.choices:
Collaborator:

nit:

Suggested change:
-if hasattr(chunk, "choices") and chunk.choices:
+if choices := getattr(chunk, "choices", None):

Collaborator (Author):

done!
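The accepted pattern can be sketched with a dummy chunk. The attribute path choices[0].delta.reasoning_content is an assumption about the streaming chunk shape for illustration, not a guarantee of litellm's schema:

```python
from types import SimpleNamespace


def extract_reasoning(chunk):
    """Return the chunk's reasoning content if present, None otherwise."""
    # getattr with a None default handles chunks that lack choices entirely;
    # the walrus binds the non-empty list in one step.
    if choices := getattr(chunk, "choices", None):
        delta = getattr(choices[0], "delta", None)
        return getattr(delta, "reasoning_content", None)
    return None


chunk = SimpleNamespace(
    choices=[SimpleNamespace(delta=SimpleNamespace(reasoning_content="step 1..."))]
)
print(extract_reasoning(chunk))  # step 1...
```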
