fix(instrumentation-langchain): Fix langchain end span #3007
Conversation
Caution
Changes requested ❌
Reviewed everything up to 0993186 in 1 minute and 32 seconds.
- Reviewed 18 lines of code in 1 file
- Skipped 0 files when reviewing
- Skipped posting 1 draft comment; view it below
1. packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/callback_handler.py:397
Draft comment: Consider using `self.spans.pop(run_id, None)` for removing the span associated with `run_id`. This approach is more concise and reduces the chance of errors.
Reason this comment was not posted: marked as duplicate.
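For reference, the pattern the draft comment is suggesting, shown as a hedged sketch rather than the PR's actual diff:

```python
# Guarded delete: works, but is more verbose and raises KeyError if the key
# has already been removed elsewhere.
if run_id in self.spans:
    del self.spans[run_id]

# Suggested alternative: pop with a default never raises, even if the span
# for this run_id was already cleaned up.
self.spans.pop(run_id, None)
```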
Hey @DRXD1000, thanks for opening this! I'm happy to jump into the investigation. Do you have any minimal repro application and setup for tracking memory consumption?
@dinmukhamedm sadly not. I tested with a running application, using the memray package locally and Grafana Kubernetes dashboards. The application I tested with is a LangGraph StateGraph inside a FastAPI application. Maybe someone from the mentioned issue can provide a minimal example. What I can provide right now are screenshots of the memray graphs. The only thing I changed is the openllmetry package. Both runs went through 120 requests to the FastAPI service.
OpenLLMetry main branch without modification: 512 MB -> 543.3 MB
Branch of this pull request: 455.9 MB -> 458.3 MB
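For anyone who wants to repeat the measurement, a minimal sketch of how the capture could be taken with memray's `Tracker` API; the module path `app.main:app` and the output file name are assumptions for illustration, not taken from the PR:

```python
# Wrap the FastAPI server's lifetime in memray's Tracker so allocations made
# while serving requests are recorded to a capture file.
import uvicorn
from memray import Tracker

if __name__ == "__main__":
    with Tracker("traces.bin"):  # writes the allocation capture on exit
        # Serve the instrumented LangGraph/FastAPI application; send ~120
        # requests against it, then stop the server to flush the capture.
        uvicorn.run("app.main:app", host="0.0.0.0", port=8000)
```

The resulting file can then be rendered with `memray flamegraph traces.bin` and the reports compared between the main branch and this branch.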
Title format: feat(instrumentation): ... or fix(instrumentation): ...
Regarding #2790, I have experimented with this change. As far as I have seen in my applications, it reduces the memory leak significantly, but does not fix it completely.
I would very much appreciate it if someone could look at this change and check its usability.
Important
Fixes memory leak in `TraceloopCallbackHandler` by ensuring spans are deleted from `self.spans` after ending.
- Changes `TraceloopCallbackHandler` to delete spans from `self.spans` after they are ended in `_end_span()`.
- Removes `child_id` and `run_id` from `self.spans` if they exist.
This description was created by Ellipsis for 0993186. It will automatically update as commits are pushed.
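For context, a minimal sketch of the cleanup pattern the summary describes, assuming `self.spans` maps run IDs to holder objects that track their child run IDs; the attribute names and type hints here are illustrative, not the exact diff:

```python
from uuid import UUID

from opentelemetry.trace import Span


def _end_span(self, span: Span, run_id: UUID) -> None:
    # End any child spans that are still open before ending the parent.
    for child_id in self.spans[run_id].children:
        child_span = self.spans[child_id].span
        if child_span.end_time is None:  # don't end a span twice
            child_span.end()
        # Drop the child entry so the handler no longer holds a reference.
        self.spans.pop(child_id, None)
    span.end()
    # Drop the parent entry as well. Without this, self.spans keeps growing
    # with every run, which is the memory leak this PR addresses.
    self.spans.pop(run_id, None)
```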