
fix(instrumentation-langchain): Fix langchain end span #3007

Open · wants to merge 7 commits into main

Conversation

@DRXD1000 commented on Jun 11, 2025

  • I have added tests that cover my changes.
  • If adding a new instrumentation or changing an existing one, I've added screenshots from some observability platform showing the change.
  • PR name follows conventional commits format: feat(instrumentation): ... or fix(instrumentation): ....
  • (If applicable) I have updated the documentation accordingly.

Regarding #2790, I have experimented with this change. As far as I can tell from my applications, it reduces the memory leak significantly but does not fix it completely.

I would really appreciate it if someone could look at this change and check whether it is usable.


Important

Fixes a memory leak in TraceloopCallbackHandler by ensuring spans are deleted from self.spans after they are ended.

This description was created by Ellipsis for 0993186.
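For readers skimming the thread, here is a simplified sketch of the idea behind the change. The class and attribute names come from the summary above, but the method signature and the span type held in the dict are assumptions; this is not the merged diff from callback_handler.py.

```python
# Simplified sketch, not the actual diff: the callback handler keeps one
# entry per run_id in self.spans, and the fix drops that entry once the span
# has ended, so finished spans stop accumulating for the lifetime of the
# handler. The value type and _end_span signature are illustrative.
from uuid import UUID

from opentelemetry.trace import Span


class TraceloopCallbackHandler:
    def __init__(self) -> None:
        self.spans: dict[UUID, Span] = {}  # run_id -> span bookkeeping

    def _end_span(self, span: Span, run_id: UUID) -> None:
        span.end()
        # New behaviour: release the reference so the entry can be
        # garbage collected instead of being kept forever.
        if run_id in self.spans:
            del self.spans[run_id]
```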

@CLAassistant commented on Jun 11, 2025

CLA assistant check
All committers have signed the CLA.

@ellipsis-dev bot (Contributor) left a comment


Caution

Changes requested ❌

Reviewed everything up to 0993186 in 1 minute and 32 seconds.
  • Reviewed 18 lines of code in 1 file
  • Skipped 0 files when reviewing.
  • Skipped posting 1 draft comment. View it below.
1. packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/callback_handler.py:397
  • Draft comment:
    Consider using self.spans.pop(run_id, None) for removing the span associated with run_id. This approach is more concise and reduces the chance of errors.
  • Reason this comment was not posted:
    Marked as duplicate.
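For context, the two cleanup styles the draft comment contrasts look roughly like this (the spans dict and run_id key mirror the handler's attributes; the snippet is illustrative, not code from this PR):

```python
from uuid import UUID, uuid4

# Stand-in for the handler's run_id -> span mapping.
spans: dict[UUID, str] = {uuid4(): "span"}
run_id = next(iter(spans))

# Check-then-delete: two dict lookups, and dropping the guard would raise
# KeyError if the entry is already gone.
if run_id in spans:
    del spans[run_id]

# pop() with a default: a single call that is a no-op when the key is
# missing, which is what the draft comment means by "reduces the chance
# of errors".
spans.pop(run_id, None)
```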


@dinmukhamedm (Collaborator) commented

Hey @DRXD1000, thanks for opening this! I'm happy to jump into the investigation on

"but does not fix it completely."

Do you have any minimal repro application and setup for tracking memory consumption?

@DRXD1000 (Author) commented on Jun 11, 2025

@dinmukhamedm Sadly not. I tested with a running application, using the memray package locally and Grafana Kubernetes dashboards. The application I tested with is a LangGraph StateGraph inside a FastAPI application. Maybe someone from the mentioned issue can provide a minimal example.

What I can provide right now are screenshots of the memray graphs. The only thing I changed is the OpenLLMetry package. Both runs went through 120 requests to the FastAPI service.

OpenLLMetry main branch without modification: 512 MB -> 543.3 MB

[Screenshots: Main-Branch-Start, Main-Branch-End]

Branch of this pull request: 455.9 MB -> 458.3 MB

[Screenshots: Fix-Branch-Start, Fix-Branch-End]
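Since a minimal repro was asked for above, here is a rough, hypothetical sketch of the kind of measurement described in this comment. The module name my_langgraph_service, the port, and the output file are made up; the PR does not include a repro script.

```python
# Hypothetical repro sketch: run a FastAPI app that wraps a LangGraph
# StateGraph under memray's Tracker, send ~120 requests from another
# terminal, then render the capture with `memray flamegraph profile.bin`
# and compare the two branches.
import uvicorn
from memray import Tracker

from my_langgraph_service import app  # hypothetical instrumented FastAPI app

if __name__ == "__main__":
    with Tracker("profile.bin"):
        uvicorn.run(app, host="0.0.0.0", port=8000)
```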
