llmobs: add custom instrumentation otel example (#32781)
* add custom section, note that we do not support OpenLLMetry or OpenInference currently
* Yun comments
* edits
---------
Co-authored-by: cecilia saixue watt <[email protected]>
content/en/llm_observability/instrumentation/otel_instrumentation.md (+103, -3)
@@ -64,11 +64,15 @@ To generate traces compatible with LLM Observability, do one of the following:
 
 After your application starts sending data, the traces automatically appear in the [**LLM Observability Traces** page][3]. To search for your traces in the UI, use the `ml_app` attribute, which is automatically set to the value of your OpenTelemetry root span's `service` attribute.
 
-**Note**: There may be a 3-5 minute delay between sending traces and seeing them appear on the LLM Observability Traces page.
+<div class="alert alert-danger">OpenInference and OpenLLMetry are not supported, as they have not been updated to support OpenTelemetry 1.37 semantic conventions for generative AI.</div>
 
-### Example
+**Note**: There may be a 3-5 minute delay between sending traces and seeing them appear on the LLM Observability Traces page. If you have APM enabled, traces appear immediately on the APM Traces page.
 
-The following example demonstrates a complete application using strands-agents with the OpenTelemetry integration. This same approach works with any framework that supports OpenTelemetry version 1.37 semantic conventions for generative AI, or with custom instrumentation that emits the required `gen_ai.*` attributes.
+### Examples
+
+#### Using strands-agents
+
+The following example demonstrates a complete application using strands-agents with the OpenTelemetry integration. This same approach works with any framework that supports OpenTelemetry version 1.37 semantic conventions for generative AI.
 
 ```python
 from strands import Agent
@@ -101,6 +105,102 @@ if __name__ == "__main__":
     print(f"Agent: {result}")
 ```
 
+#### Custom OpenTelemetry instrumentation
+
+The following example demonstrates how to instrument your LLM application using custom OpenTelemetry code. This approach gives you full control over the traces and spans emitted by your application.
+
+```python
+import os
+import json
+from opentelemetry import trace
+from opentelemetry.sdk.trace import TracerProvider
+from opentelemetry.sdk.trace.export import BatchSpanProcessor
+from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
+from opentelemetry.sdk.resources import Resource, SERVICE_NAME
+from openai import OpenAI
+
+# Configure OpenTelemetry to send traces to Datadog
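The diff view is truncated just after the "Configure OpenTelemetry to send traces to Datadog" comment, so the body of the custom instrumentation example is not visible here. As a minimal sketch only, not the PR's actual code, the following shows one way such a setup could continue: pointing the OTLP/HTTP exporter at a local Datadog Agent (the `http://localhost:4318/v1/traces` endpoint, the `my-llm-app` service name, and the `chat` helper are all illustrative assumptions) and wrapping an OpenAI call in a span that carries `gen_ai.*` attributes in the spirit of the OpenTelemetry 1.37 semantic conventions for generative AI.

```python
from openai import OpenAI
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource, SERVICE_NAME
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Configure OpenTelemetry to send traces to Datadog.
# Assumption: a Datadog Agent with OTLP ingest enabled is listening on
# localhost:4318 (the default OTLP/HTTP port); adjust the endpoint as needed.
resource = Resource.create({SERVICE_NAME: "my-llm-app"})  # root span's service value
provider = TracerProvider(resource=resource)
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces"))
)
trace.set_tracer_provider(provider)
tracer = trace.get_tracer(__name__)

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def chat(prompt: str) -> str:
    # One span per LLM call, tagged with gen_ai.* attributes. The attribute
    # names below follow the OTel generative AI semantic conventions; verify
    # them against the exact set LLM Observability expects.
    with tracer.start_as_current_span("chat gpt-4o-mini") as span:
        span.set_attribute("gen_ai.operation.name", "chat")
        span.set_attribute("gen_ai.provider.name", "openai")
        span.set_attribute("gen_ai.request.model", "gpt-4o-mini")
        response = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": prompt}],
        )
        span.set_attribute("gen_ai.response.model", response.model)
        span.set_attribute("gen_ai.usage.input_tokens", response.usage.prompt_tokens)
        span.set_attribute("gen_ai.usage.output_tokens", response.usage.completion_tokens)
        return response.choices[0].message.content


if __name__ == "__main__":
    print(f"Agent: {chat('Hello!')}")
```

If this wiring is in place, the root span's `service` value (`my-llm-app` here) is what the updated page says maps to `ml_app` when you search for the trace in the LLM Observability UI.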