Alui/traced #205

Merged
mikegros merged 5 commits into main from alui/traced on Mar 25, 2026
Conversation

Collaborator

@luiarthur luiarthur commented Mar 18, 2026

@ndebard Let's chat about this draft regarding #201 on 3/19.

This draft PR implements TracedChatOpenAI and TracedChatOllama, which wrap .invoke (of ChatOpenAI and ChatOllama, respectively) with logic that records the text sent to and received from the LLM, along with call metadata when available. In particular, reasoning summaries are saved when present.

from pathlib import Path

from ursa.agents import ExecutionAgent
from ursa.util.traced import TracedChatOpenAI

# OpenAI model with reasoning abilities
llm = TracedChatOpenAI(
    model="gpt-5-nano", reasoning={"effort": "low", "summary": "auto"}
)

# NOTE: Ollama model with reasoning abilities
# from ursa.util.traced import TracedChatOllama
# llm = TracedChatOllama(
#     model="nemotron-3-nano:4b", reasoning=True
# )

# NOTE: Ollama model without reasoning abilities
# llm = TracedChatOllama(model="nemotron-mini:4b")

executor = ExecutionAgent(llm=llm)
executor.invoke(
    "Write a python script to print the first 10 positive integers."
)

# Save messages to JSON. Omit the indent arg for minified JSON.
llm.save_messages(Path("messages.json"), indent=2)
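For anyone who wants to see the wrapping idea without installing ursa, here is a minimal self-contained sketch. The class names (FakeChatModel, TracedModel) and the log schema are illustrative stand-ins, not the PR's actual implementation:

```python
import json
from pathlib import Path
from tempfile import TemporaryDirectory


class FakeChatModel:
    """Stand-in for ChatOpenAI / ChatOllama."""

    def invoke(self, prompt: str) -> str:
        return f"echo: {prompt}"


class TracedModel(FakeChatModel):
    """Record every .invoke() call, then dump the log as JSON."""

    def __init__(self):
        self._messages = []

    def invoke(self, prompt: str) -> str:
        output = super().invoke(prompt)
        # Appended in call order, so the log is chronological.
        self._messages.append({"input": prompt, "output": output})
        return output

    def save_messages(self, path: Path, indent=None):
        path.write_text(json.dumps(self._messages, indent=indent))


llm = TracedModel()
llm.invoke("hello")
with TemporaryDirectory() as d:
    out = Path(d) / "messages.json"
    llm.save_messages(out, indent=2)
    log = json.loads(out.read_text())
print(len(log), log[0]["output"])
```

The key point is that tracing lives entirely in the subclass, so callers use the model exactly as before.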

Collaborator Author

luiarthur commented Mar 19, 2026

TODO

  • [ ] Add a planning agent demo
  • [ ] Add other LLM metadata to messages.json
    • context windows
    • reasoning level
    • other typical metadata that is easily obtained

@luiarthur luiarthur marked this pull request as ready for review March 23, 2026 17:31
Collaborator Author

luiarthur commented Mar 23, 2026

@mikegros Could you review this PR?

The only existing source code that was changed is src/ursa/workflows/plan_execute_workflow.py where I added a None default for workspace as it currently isn't used.

The other changes are new files, so this PR adds message-logging abilities without introducing breaking changes.

@mikegros
Collaborator

I'll try to review before our meeting. Otherwise, later today.

"using Monte Carlo; use standard lib only."
"Plan at most two steps."
)
llm.save_messages(Path(f"messages.json"), indent=2)
Collaborator


Since this is connected to the LLM, not a specific agent, how are the messages sorted here? It would be nice to make it clear to a user in the documentation how that works. Maybe just with a small comment.

Collaborator Author

@luiarthur luiarthur Mar 23, 2026


"sorted" as in what order the messages appear?

parent = cast(BaseChatModel, super())

output = parent.invoke(input, config=config, **kwargs)
self._append_message(output)
Collaborator


This appends at each LLM call, which is really smooth, but I am a little worried that the message log could get messy in multiagent settings. Specifically, when we instantiate one LLM object and then hand it to multiple agents (or even agents that can be used as tools of other agents).

I think this will lead to conversation histories that are mixed into each other. It might be nice to have some meta_data tag for what agent did the call.

This might be a "future work" effort instead of something to merge in here, but I wanted to bring it up just in case.
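A rough sketch of what such a tag could look like. Everything here is hypothetical (the `agent` field, `current_agent` attribute, and the stand-in model are mine, not this PR's API); it only illustrates how one mixed log could be split back into per-agent histories:

```python
class TracedLLM:
    """Stand-in traced model that tags each call with the calling agent."""

    def __init__(self):
        self._messages = []
        self.current_agent = None  # set by whichever agent holds the model

    def invoke(self, prompt: str) -> str:
        output = prompt.upper()  # stand-in for a real LLM call
        self._messages.append(
            {"agent": self.current_agent, "input": prompt, "output": output}
        )
        return output


llm = TracedLLM()

# One shared model object, used by two different "agents".
llm.current_agent = "planner"
llm.invoke("draft a plan")
llm.current_agent = "executor"
llm.invoke("run step 1")

# Per-agent histories can then be recovered from the single mixed log.
planner_log = [m for m in llm._messages if m["agent"] == "planner"]
print(len(planner_log), llm._messages[1]["agent"])
```

The awkward part, as noted above, is who sets the tag: agents would need to cooperate (or the wrapper would need callsite introspection), which is why deferring this to a follow-up seems reasonable.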

Collaborator Author

@luiarthur luiarthur Mar 23, 2026


Implementing code to add agent info will indeed be a heavier lift and could easily break existing code. Could we revisit after this PR?

@mikegros mikegros merged commit 451af46 into main Mar 25, 2026
2 checks passed
@mikegros mikegros deleted the alui/traced branch March 25, 2026 18:35