diff --git a/doc/tracing.md b/doc/tracing.md
index a439168..1f6bc12 100644
--- a/doc/tracing.md
+++ b/doc/tracing.md
@@ -6,8 +6,8 @@ All frameworks or SDKs that support OTLP and follow [semantic conventions for ge
| | Azure AI Inference | Azure AI Foundry Agents Service | Anthropic | Gemini | LangChain | OpenAI SDK | OpenAI Agents SDK |
|---|---|---|---|---|---|---|---|
-| **Python** | ✅ | ✅ | ✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2 | ✅ | ✅ ([LangSmith](https://github.com/langchain-ai/langsmith-sdk)) 1,2 | ✅ ([opentelemetry-python-contrib](https://github.com/open-telemetry/opentelemetry-python-contrib)) 1 | ✅ ([Logfire](https://github.com/pydantic/logfire)) 1,2 |
-| **TS/JS** | ✅ | ✅ | ✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2| ❌ |✅ ([traceloop](https://github.com/traceloop/openllmetry)) 1,2 |✅ ([traceloop](https://github.com/traceloop/openllmetry)) 1,2|❌|
+| **Python** | ✅ | ✅ | ✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2 | ✅ | ✅ ([LangSmith](https://github.com/langchain-ai/langsmith-sdk))1,2 | ✅ ([opentelemetry-python-contrib](https://github.com/open-telemetry/opentelemetry-python-contrib))1 | ✅ ([Logfire](https://github.com/pydantic/logfire))1,2 |
+| **TS/JS** | ✅ | ✅ | ✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2| ❌ |✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2 |✅ ([traceloop](https://github.com/traceloop/openllmetry))1,2|❌|
> 1. The SDKs in parentheses are third-party SDKs that provide OTLP instrumentation. They are used because the official SDKs don't support OTLP.
> 2. These instrumentation SDKs don't strictly adhere to the OpenTelemetry semantic conventions for generative AI systems.
@@ -29,6 +29,11 @@ All frameworks or SDKs that support OTLP and follow [semantic conventions for ge
## Set up Instrumentation
+In each example below, the code changes focus on two things:
+
+- Instrumenting the LLM/agent application.
+- Configuring the OTLP trace exporter to use the AITK local collector.
+
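+Instead of hard-coding the endpoint in each snippet, SDKs built on OpenTelemetry typically also honor the standard OTel exporter environment variables (the LangChain example below uses `OTEL_EXPORTER_OTLP_ENDPOINT` this way). A minimal sketch, assuming the AITK local collector listens on the default OTLP/HTTP port and using `my-llm-app` as a placeholder service name:
+
+```bash
+# Standard OpenTelemetry exporter variables, read by OTel-based SDKs:
+export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4318"
+export OTEL_SERVICE_NAME="my-llm-app"
+```
+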
Azure AI Inference SDK - Python
@@ -66,7 +71,7 @@ AIInferenceInstrumentor().instrument(True)
-Azure AI Inference SDK - TypeScript / JavaScript
+Azure AI Inference SDK - TypeScript/JavaScript
**Installation:**
```bash
@@ -141,7 +146,7 @@ AIAgentsInstrumentor().instrument(True)
-Azure AI Foundry Agent Service - TypeScript / JavaScript
+Azure AI Foundry Agent Service - TypeScript/JavaScript
**Installation:**
```bash
@@ -180,6 +185,127 @@ registerInstrumentations({
```
+
+Anthropic - Python
+
+**Installation:**
+```bash
+pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-anthropic
+```
+
+**Setup:**
+```python
+from opentelemetry import trace
+from opentelemetry.sdk.resources import Resource
+from opentelemetry.sdk.trace import TracerProvider
+from opentelemetry.sdk.trace.export import BatchSpanProcessor
+from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
+
+resource = Resource(attributes={
+ "service.name": "opentelemetry-instrumentation-anthropic-traceloop"
+})
+provider = TracerProvider(resource=resource)
+otlp_exporter = OTLPSpanExporter(
+ endpoint="http://localhost:4318/v1/traces",
+)
+processor = BatchSpanProcessor(otlp_exporter)
+provider.add_span_processor(processor)
+trace.set_tracer_provider(provider)
+
+from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor
+AnthropicInstrumentor().instrument()
+```
+
+
+
+Anthropic - TypeScript/JavaScript
+
+**Installation:**
+```bash
+npm install @traceloop/node-server-sdk
+```
+
+**Setup:**
+```javascript
+const { initialize } = require("@traceloop/node-server-sdk");
+
+initialize({
+ appName: "opentelemetry-instrumentation-anthropic-traceloop",
+ baseUrl: "http://localhost:4318",
+ disableBatch: true,
+});
+```
+
+
+
+Google Gemini - Python
+
+**Installation:**
+```bash
+pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-google-genai
+```
+
+**Setup:**
+```python
+from opentelemetry import trace
+from opentelemetry.sdk.resources import Resource
+from opentelemetry.sdk.trace import TracerProvider
+from opentelemetry.sdk.trace.export import BatchSpanProcessor
+from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
+
+resource = Resource(attributes={
+ "service.name": "opentelemetry-instrumentation-google-genai"
+})
+provider = TracerProvider(resource=resource)
+otlp_exporter = OTLPSpanExporter(
+ endpoint="http://localhost:4318/v1/traces",
+)
+processor = BatchSpanProcessor(otlp_exporter)
+provider.add_span_processor(processor)
+trace.set_tracer_provider(provider)
+
+from opentelemetry.instrumentation.google_genai import GoogleGenAiSdkInstrumentor
+GoogleGenAiSdkInstrumentor().instrument(enable_content_recording=True)
+```
+
+
+
+LangChain - Python
+
+**Installation:**
+```bash
+pip install "langsmith[otel]"
+```
+
+**Setup:**
+```python
+import os
+os.environ["LANGSMITH_OTEL_ENABLED"] = "true"
+os.environ["LANGSMITH_TRACING"] = "true"
+os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "http://localhost:4318"
+```
+
+
+
+LangChain - TypeScript/JavaScript
+
+**Installation:**
+```bash
+npm install @traceloop/node-server-sdk
+```
+
+**Setup:**
+```javascript
+const { initialize } = require("@traceloop/node-server-sdk");
+initialize({
+ appName: "opentelemetry-instrumentation-langchain-traceloop",
+ baseUrl: "http://localhost:4318",
+ disableBatch: true,
+});
+```
+
+
OpenAI - Python
@@ -233,6 +359,29 @@ initialize({
```
+
+OpenAI Agents SDK - Python
+
+**Installation:**
+```bash
+pip install logfire
+```
+
+**Setup:**
+```python
+import logfire
+import os
+
+os.environ["OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"] = "http://localhost:4318/v1/traces"
+
+logfire.configure(
+ service_name="opentelemetry-instrumentation-openai-agents-logfire",
+ send_to_logfire=False,
+)
+logfire.instrument_openai_agents()
+```
+
+
## A Full Example
Here's a complete working example using Azure AI Inference SDK with Python that demonstrates how to set up both the tracing provider and instrumentation.