Merge pull request #242 from microsoft/aochengwang/tracing

feat: add all remaining sdk instructions
This commit is contained in:
Alex Wang 2025-07-16 15:30:13 +08:00 committed by GitHub
commit c2d1ec927d
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
1 changed file with 153 additions and 4 deletions


@@ -29,6 +29,11 @@ All frameworks or SDKs that support OTLP and follow [semantic conventions for ge
## Set up Instrumentation
Overall, the code changes focus on:
- Instrumenting the LLM/agent application.
- Configuring the OTLP trace exporter to use the AITK local collector.
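Before wiring up an exporter, it can help to confirm that the AITK local collector is actually listening on its default OTLP/HTTP port (4318); otherwise exported spans are silently dropped. A minimal stdlib sketch (the helper name and default values are illustrative, not part of any SDK):

```python
import socket

def collector_reachable(host: str = "localhost", port: int = 4318,
                        timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to the collector endpoint succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if not collector_reachable():
    print("AITK local collector not reachable; traces will be dropped.")
```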
<details>
<summary>Azure AI Inference SDK - Python</summary>
@@ -180,6 +185,127 @@ registerInstrumentations({
```
</details>
<details>
<summary>Anthropic - Python</summary>
**Installation:**
```bash
pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-anthropic
```
**Setup:**
```python
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.instrumentation.anthropic import AnthropicInstrumentor

# Name the service so its traces are easy to find in the viewer.
resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-anthropic-traceloop"
})

# Send spans over OTLP/HTTP to the AITK local collector.
provider = TracerProvider(resource=resource)
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces",
)
processor = BatchSpanProcessor(otlp_exporter)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

# Automatically create spans for all Anthropic SDK calls.
AnthropicInstrumentor().instrument()
```
</details>
<details>
<summary>Anthropic - TypeScript/JavaScript</summary>
**Installation:**
```bash
npm install @traceloop/node-server-sdk
```
**Setup:**
```javascript
const { initialize } = require("@traceloop/node-server-sdk");

// Point the Traceloop SDK at the AITK local collector.
initialize({
  appName: "opentelemetry-instrumentation-anthropic-traceloop",
  baseUrl: "http://localhost:4318",
  disableBatch: true,
});
```
</details>
<details>
<summary>Google Gemini - Python</summary>
**Installation:**
```bash
pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-google-genai
```
**Setup:**
```python
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.instrumentation.google_genai import GoogleGenAiSdkInstrumentor

# Name the service so its traces are easy to find in the viewer.
resource = Resource(attributes={
    "service.name": "opentelemetry-instrumentation-google-genai"
})

# Send spans over OTLP/HTTP to the AITK local collector.
provider = TracerProvider(resource=resource)
otlp_exporter = OTLPSpanExporter(
    endpoint="http://localhost:4318/v1/traces",
)
processor = BatchSpanProcessor(otlp_exporter)
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)

# Automatically create spans for Google GenAI SDK calls;
# enable_content_recording also captures prompts and responses.
GoogleGenAiSdkInstrumentor().instrument(enable_content_recording=True)
```
</details>
<details>
<summary>LangChain - Python</summary>
**Installation:**
```bash
pip install langsmith[otel]
```
**Setup:**
```python
import os

# Enable LangSmith's OpenTelemetry export and point it at the
# AITK local collector. Set these before running any chains.
os.environ["LANGSMITH_OTEL_ENABLED"] = "true"
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "http://localhost:4318"
```
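The same variables can equivalently be set in the shell before launching the application, which keeps tracing configuration out of the code entirely:

```bash
export LANGSMITH_OTEL_ENABLED=true
export LANGSMITH_TRACING=true
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4318"
# then launch your application as usual
```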
</details>
<details>
<summary>LangChain - TypeScript/JavaScript</summary>
**Installation:**
```bash
npm install @traceloop/node-server-sdk
```
**Setup:**
```javascript
const { initialize } = require("@traceloop/node-server-sdk");

// Point the Traceloop SDK at the AITK local collector.
initialize({
  appName: "opentelemetry-instrumentation-langchain-traceloop",
  baseUrl: "http://localhost:4318",
  disableBatch: true,
});
```
</details>
<details>
<summary>OpenAI - Python</summary>
@@ -233,6 +359,29 @@ initialize({
```
</details>
<details>
<summary>OpenAI Agents SDK - Python</summary>
**Installation:**
```bash
pip install logfire
```
**Setup:**
```python
import os

import logfire

# Route exported traces to the AITK local collector.
os.environ["OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"] = "http://localhost:4318/v1/traces"

# Configure Logfire for local export only; nothing is sent to the Logfire cloud.
logfire.configure(
    service_name="opentelemetry-instrumentation-openai-agents-logfire",
    send_to_logfire=False,
)

# Automatically create spans for OpenAI Agents SDK runs.
logfire.instrument_openai_agents()
```
</details>
## A Full Example
Here's a complete working example using the Azure AI Inference SDK in Python that demonstrates how to set up both the tracing provider and the instrumentation.