From ccbcd6f2790d45807158a834b70ba3b0619a8908 Mon Sep 17 00:00:00 2001
From: Aocheng Wang
Date: Wed, 16 Jul 2025 11:17:05 +0800
Subject: [PATCH 1/3] feat: add all remaining sdk instructions

---
 doc/tracing.md | 154 +++++++++++++++++++++++++++++++++++++++++++++++--
 1 file changed, 150 insertions(+), 4 deletions(-)

diff --git a/doc/tracing.md b/doc/tracing.md
index a439168..16b9c4d 100644
--- a/doc/tracing.md
+++ b/doc/tracing.md
@@ -6,8 +6,8 @@ All frameworks or SDKs that support OTLP and follow [semantic conventions for ge
| | Azure AI Inference | Azure AI Foundry Agents Service | Anthropic | Gemini | LangChain | OpenAI SDK | OpenAI Agents SDK |
|---|---|---|---|---|---|---|---|
-| **Python** | ✅ | ✅ | ✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2 | ✅ | ✅ ([LangSmith](https://github.com/langchain-ai/langsmith-sdk)) 1,2 | ✅ ([opentelemetry-python-contrib](https://github.com/open-telemetry/opentelemetry-python-contrib)) 1 | ✅ ([Logfire](https://github.com/pydantic/logfire)) 1,2 |
-| **TS/JS** | ✅ | ✅ | ✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2| ❌ |✅ ([traceloop](https://github.com/traceloop/openllmetry)) 1,2 |✅ ([traceloop](https://github.com/traceloop/openllmetry)) 1,2|❌|
+| **Python** | ✅ | ✅ | ✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2 | ✅ | ✅ ([LangSmith](https://github.com/langchain-ai/langsmith-sdk))1,2 | ✅ ([opentelemetry-python-contrib](https://github.com/open-telemetry/opentelemetry-python-contrib))1 | ✅ ([Logfire](https://github.com/pydantic/logfire))1,2 |
+| **TS/JS** | ✅ | ✅ | ✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2| ❌ |✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2 |✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2|❌|
> 1. The SDKs in parentheses are third-party SDKs that provide OTLP instrumentation; they are used because the official SDKs don't support OTLP.
> 2. These instrumentation SDKs don't strictly adhere to the OpenTelemetry semantic conventions for generative AI systems.
@@ -66,7 +66,7 @@ AIInferenceInstrumentor().instrument(True)
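For readers unfamiliar with the conventions mentioned in footnote 2: spans that follow the OpenTelemetry generative-AI semantic conventions carry a predictable set of `gen_ai.*` attributes, which is what makes traces from different SDKs comparable. A rough, stdlib-only sketch — the attribute keys follow the semconv, but the values are invented for illustration:

```python
# Illustrative span data following the OpenTelemetry GenAI semantic
# conventions; all values here are made up for the example.
chat_span = {
    "name": "chat gpt-4o",  # convention: "{gen_ai.operation.name} {gen_ai.request.model}"
    "attributes": {
        "gen_ai.operation.name": "chat",   # kind of GenAI operation
        "gen_ai.system": "openai",         # provider/framework emitting the span
        "gen_ai.request.model": "gpt-4o",  # model the caller asked for
        "gen_ai.usage.input_tokens": 120,  # prompt tokens consumed
        "gen_ai.usage.output_tokens": 42,  # completion tokens produced
    },
}

attrs = chat_span["attributes"]
# Backends keyed to the conventions can rely on these names being stable.
assert chat_span["name"] == f'{attrs["gen_ai.operation.name"]} {attrs["gen_ai.request.model"]}'
```

SDKs marked with footnote 2 may emit different attribute names, which is worth checking before building dashboards on them.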
-Azure AI Inference SDK - TypeScript / JavaScript
+Azure AI Inference SDK - TypeScript/JavaScript

**Installation:**
```bash
@@ -141,7 +141,7 @@ AIAgentsInstrumentor().instrument(True)
-Azure AI Foundry Agent Service - TypeScript / JavaScript
+Azure AI Foundry Agent Service - TypeScript/JavaScript

**Installation:**
```bash
@@ -180,6 +180,129 @@ registerInstrumentations({
```
+
+Anthropic - Python + +**Installation:** +```bash +pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-anthropic +``` + +**Setup:** +```python +from opentelemetry import trace +from opentelemetry.sdk.resources import Resource +from opentelemetry.sdk.trace import TracerProvider +from opentelemetry.sdk.trace.export import BatchSpanProcessor +from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter + +resource = Resource(attributes={ + "service.name": "opentelemetry-instrumentation-anthropic-traceloop" +}) +provider = TracerProvider(resource=resource) +otlp_exporter = OTLPSpanExporter( + endpoint="http://localhost:4318/v1/traces", +) +processor = BatchSpanProcessor(otlp_exporter) +provider.add_span_processor(processor) +trace.set_tracer_provider(provider) + +from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor +AnthropicInstrumentor().instrument() +``` +
+ +
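The Python setup above repeats a pattern used throughout this document: a `TracerProvider` carries a `Resource`, span processors attach to the provider, and each processor forwards finished spans to an exporter. A toy, stdlib-only model of that flow (these classes are illustrative stand-ins, not the opentelemetry-sdk API):

```python
# Toy stand-ins for the TracerProvider / SpanProcessor / SpanExporter roles.
class ToyExporter:
    def __init__(self):
        self.received = []            # spans that reached the "backend"

    def export(self, spans):
        self.received.extend(spans)

class ToyProcessor:
    def __init__(self, exporter):
        self.exporter = exporter

    def on_end(self, span):
        self.exporter.export([span])  # forward each finished span

class ToyProvider:
    def __init__(self, resource):
        self.resource = resource      # e.g. {"service.name": "my-app"}
        self.processors = []

    def add_span_processor(self, processor):
        self.processors.append(processor)

    def end_span(self, span):
        for processor in self.processors:
            processor.on_end(span)

exporter = ToyExporter()
provider = ToyProvider({"service.name": "demo"})
provider.add_span_processor(ToyProcessor(exporter))
provider.end_span({"name": "chat claude-3"})
assert exporter.received == [{"name": "chat claude-3"}]
```

Once this mental model is in place, the per-SDK sections differ only in which instrumentor hooks the library's calls into the provider.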
+
Anthropic - TypeScript/JavaScript
+
+**Installation:**
+```bash
+npm install @traceloop/node-server-sdk
+```
+
+**Setup:**
+```javascript
+const { initialize } = require("@traceloop/node-server-sdk");
+
+initialize({
+  appName: "opentelemetry-instrumentation-anthropic-traceloop",
+  baseUrl: "http://localhost:4318",
+  disableBatch: true,
+});
+```
+
+ +
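A note on `disableBatch: true` in the traceloop setup above: it exports each span as soon as it ends instead of queueing spans for a periodic flush, which suits short-lived scripts that may exit before a batch is sent. A toy contrast of the two behaviors (not the traceloop or OTel API; real SDKs also use a background thread and timeouts):

```python
# Toy illustration of immediate vs. batched span export.
class ImmediateExport:
    def __init__(self):
        self.exported = []

    def on_end(self, span):
        self.exported.append(span)        # ship right away

class BatchedExport:
    def __init__(self, batch_size=3):
        self.batch_size = batch_size
        self.queue = []
        self.exported = []

    def on_end(self, span):
        self.queue.append(span)           # buffer until a batch is full
        if len(self.queue) >= self.batch_size:
            self.exported.append(list(self.queue))
            self.queue.clear()

batched = BatchedExport(batch_size=3)
for s in ("a", "b"):
    batched.on_end(s)
# Two spans ended, but nothing has been exported yet -- if the process
# exited here, the buffered spans would be lost without a final flush.
assert batched.exported == [] and batched.queue == ["a", "b"]
```

Batching is the better default for long-running services; the immediate mode trades throughput for not losing trailing spans.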
+Google Gemini - Python + +**Installation:** +```bash +pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-google-genai +``` + +**Setup:** +```python +from opentelemetry import trace +from opentelemetry.sdk.resources import Resource +from opentelemetry.sdk.trace import TracerProvider +from opentelemetry.sdk.trace.export import BatchSpanProcessor +from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter + +resource = Resource(attributes={ + "service.name": "opentelemetry-instrumentation-google-genai" +}) +provider = TracerProvider(resource=resource) +otlp_exporter = OTLPSpanExporter( + endpoint="http://localhost:4318/v1/traces", +) +processor = BatchSpanProcessor(otlp_exporter) +provider.add_span_processor(processor) +trace.set_tracer_provider(provider) + +from opentelemetry.instrumentation.google_genai import GoogleGenAiSdkInstrumentor +GoogleGenAiSdkInstrumentor().instrument(enable_content_recording=True) +``` +
+ +
+LangChain - Python + +**Installation:** +```bash +pip install langsmith +``` + +**Setup:** +```python +import os +from opentelemetry import trace + +os.environ["LANGSMITH_OTEL_ENABLED"] = "true" +os.environ["LANGSMITH_TRACING"] = "true" +os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "http://localhost:4318" +``` +
+ +
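Unlike the exporter-object setup used in the other Python sections, the LangSmith route above is configured entirely through environment variables, so it can be switched on without touching application code. A small stdlib sketch that prepares and sanity-checks that environment (values match the setup above):

```python
import os

# The three variables the LangSmith OTel bridge reads; exporting them
# before the application starts is the whole setup.
langsmith_env = {
    "LANGSMITH_OTEL_ENABLED": "true",                       # route traces through OTel
    "LANGSMITH_TRACING": "true",                            # turn LangChain tracing on
    "OTEL_EXPORTER_OTLP_ENDPOINT": "http://localhost:4318", # collector base URL
}
os.environ.update(langsmith_env)

assert all(os.environ[key] == value for key, value in langsmith_env.items())
```

In deployment these would normally be set in the shell or container spec rather than in Python.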
+LangChain - TypeScript/JavaScript + +**Installation:** +```bash +npm install @traceloop/node-server-sdk +``` + +**Setup:** +```javascript +const { initialize } = require("@traceloop/node-server-sdk"); +initialize({ + appName: "opentelemetry-instrumentation-langchain-traceloop", + baseUrl: "http://localhost:4318", + disableBatch: true, +}); +``` +
+
OpenAI - Python @@ -233,6 +356,29 @@ initialize({ ```
+
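The setups in this document use two related variables: `OTEL_EXPORTER_OTLP_ENDPOINT` (a base URL, to which HTTP exporters append a per-signal path such as `/v1/traces`) and `OTEL_EXPORTER_OTLP_TRACES_ENDPOINT` (used verbatim, and taking precedence for traces). A stdlib sketch of that resolution rule as the OTLP exporter specification describes it (simplified; real exporters also handle protocol selection):

```python
# How OTLP/HTTP exporters resolve the traces endpoint from the environment.
def resolve_traces_endpoint(env):
    specific = env.get("OTEL_EXPORTER_OTLP_TRACES_ENDPOINT")
    if specific:
        return specific                     # signal-specific value wins, used as-is
    base = env.get("OTEL_EXPORTER_OTLP_ENDPOINT", "http://localhost:4318")
    return base.rstrip("/") + "/v1/traces"  # generic base gets the signal path

# Both styles used in this document end up at the same endpoint.
assert resolve_traces_endpoint(
    {"OTEL_EXPORTER_OTLP_TRACES_ENDPOINT": "http://localhost:4318/v1/traces"}
) == "http://localhost:4318/v1/traces"
assert resolve_traces_endpoint(
    {"OTEL_EXPORTER_OTLP_ENDPOINT": "http://localhost:4318"}
) == "http://localhost:4318/v1/traces"
```

This is why the Logfire section below can set only the signal-specific variable while the LangChain section sets only the base one.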
+OpenAI Agents SDK - Python + +**Installation:** +```bash +pip install logfire +``` + +**Setup:** +```python +import logfire +import os + +os.environ["OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"] = "http://localhost:4318/v1/traces" + +logfire.configure( + service_name="opentelemetry-instrumentation-openai-agents-logfire", + send_to_logfire=False, +) +logfire.instrument_openai_agents() +``` +
+

## A Full Example

Here's a complete working example using Azure AI Inference SDK with Python that demonstrates how to set up both the tracing provider and instrumentation.

From 4d4f5e9051ecb24004410c94321cc5dfeb37fdf1 Mon Sep 17 00:00:00 2001
From: Aocheng Wang
Date: Wed, 16 Jul 2025 11:26:44 +0800
Subject: [PATCH 2/3] fix: langchain package

---
 doc/tracing.md | 4 +---
 1 file changed, 1 insertion(+), 3 deletions(-)

diff --git a/doc/tracing.md b/doc/tracing.md
index 16b9c4d..046bd98 100644
--- a/doc/tracing.md
+++ b/doc/tracing.md
@@ -270,14 +270,12 @@ GoogleGenAiSdkInstrumentor().instrument(enable_content_recording=True)

**Installation:**
```bash
-pip install langsmith
+pip install langsmith[otel]
```

**Setup:**
```python
import os
-from opentelemetry import trace
-
os.environ["LANGSMITH_OTEL_ENABLED"] = "true"
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "http://localhost:4318"

From 173dba03080d0f215f47f4ac09ef8e9205131a90 Mon Sep 17 00:00:00 2001
From: Aocheng Wang
Date: Wed, 16 Jul 2025 11:53:30 +0800
Subject: [PATCH 3/3] chore: add description for code changes

---
 doc/tracing.md | 5 +++++
 1 file changed, 5 insertions(+)

diff --git a/doc/tracing.md b/doc/tracing.md
index 046bd98..1f6bc12 100644
--- a/doc/tracing.md
+++ b/doc/tracing.md
@@ -29,6 +29,11 @@ All frameworks or SDKs that support OTLP and follow [semantic conventions for ge

## Set up Instrumentation

+Overall, the code changes focus on:
+
+- Instrumenting the LLM/agent application.
+- Configuring the OTLP trace exporter to use the AITK local collector.
+
Azure AI Inference SDK - Python
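Since every snippet in this guide points at the AITK local collector on `localhost:4318`, a quick preflight check can save debugging time when traces silently fail to appear. A stdlib-only helper; the instrumentation itself does not need this, and the host and port are simply this guide's defaults:

```python
import socket

def collector_reachable(host="localhost", port=4318, timeout=0.5):
    """Best-effort check that an OTLP collector is listening.
    Returns False instead of raising when nothing is there."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if not collector_reachable():
    print("warning: no collector on localhost:4318; spans will be dropped")
```

Running this once at startup gives an immediate signal, whereas a misconfigured exporter typically fails quietly in the background.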