Instrument Anthropic applications with OpenTelemetry
Learn how to instrument Anthropic-based applications with OpenTelemetry and export large language model (LLM) telemetry via OTLP into Cribl Search. Once your data is in Cribl Search, you can explore and investigate it to troubleshoot issues, analyze model behavior, and monitor performance.
You’ll complete the following high-level steps:
- Prepare Cribl Search to receive OTLP data.
- Instrument your Anthropic application (Python) with OpenTelemetry.
- Explore LLM telemetry use cases in Cribl Search.
Telemetry Captured
Each Anthropic Messages API call produces a trace span with attributes such as:
| Category | Data | Example |
|---|---|---|
| Timing | Call duration. | 0.9s. |
| Model | Requested model. | claude-3-5-haiku-20241022, claude-3-5-sonnet-20241022. |
| Token usage | Input and output tokens. | 120 input, 64 output. |
| Request params | Max tokens, temperature, system prompt. | max_tokens=256. |
| Response | Stop reason, message id. | end_turn. |
| Errors | HTTP status, API error. | 401, 429, invalid model id. |
| Content | Prompt and completion text. | (opt-in, disabled by default). |
Semantic conventions follow OpenTelemetry GenAI Semantic Conventions where applicable.
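For illustration, the attributes on a single Messages API span might look like the sketch below. The names follow the OpenTelemetry GenAI semantic conventions, but the exact set emitted depends on the instrumentor version, so treat this as an assumption rather than a guaranteed schema.

```python
# Sketch of span attributes for one Anthropic Messages API call.
# Attribute names follow the OTel GenAI semantic conventions; the exact
# set emitted depends on the instrumentor version.
span_attributes = {
    "gen_ai.system": "anthropic",
    "gen_ai.request.model": "claude-3-5-haiku-20241022",
    "gen_ai.request.max_tokens": 256,
    "gen_ai.usage.input_tokens": 120,
    "gen_ai.usage.output_tokens": 64,
    "gen_ai.response.finish_reasons": ["end_turn"],
}

# Derived fields such as total tokens are easy to compute downstream.
total_tokens = (
    span_attributes["gen_ai.usage.input_tokens"]
    + span_attributes["gen_ai.usage.output_tokens"]
)
print(total_tokens)  # 184
```

Fields like these are what you filter and aggregate on once the spans land in Cribl Search.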
Prerequisites
You’ll need:
- Cribl.Cloud Enterprise.
- Search Admin permission or higher.
- An Anthropic API key.
- Python 3.9+.
Get Data Into Cribl Search
Complete these steps before you instrument your LLM application:
- Add a lakehouse engine. This provides storage and compute for data you’re going to ingest into Cribl Search.
- Add a Cribl Search OpenTelemetry Source to start ingesting the data.
- Set up your Datatype rules to parse and structure the incoming data.
- Set up your Dataset rules to route ingested events into individual Search Datasets. This will let you scope your queries and control retention.
Add a Lakehouse Engine
A lakehouse engine is the storage-and-compute unit in Cribl Search that holds ingested OTLP (and other Source data) until Dataset retention expires. See Lakehouse Engines in Cribl Search to learn how to set up a new lakehouse engine.
Configure the OpenTelemetry (OTel) Source in Cribl Search
To receive OTLP from your LLM application directly in Cribl Search, add an OpenTelemetry Source.
On the Cribl.Cloud top bar, select Products > Search. Under Data, select Add Source > OpenTelemetry.
In the New Source modal, configure the following under General Settings:
- ID: Unique Source ID across your Cribl.Cloud Workspace. Use letters, numbers, underscores, and hyphens.
- Description (optional): Describe the Source (for example, OTLP from LLM-instrumented apps).
- Address: Hostname that your OpenTelemetry collector or agent connects to. You will use this in exporter configuration.
- Port: Network port to listen on (default `4317` for gRPC). Change it if you use a different port or protocol.
- OTLP version: Version that matches your upstream sender (default `1.3.1`).
- Protocol: gRPC (default) or HTTP. This must match your OpenTelemetry exporter.
Under Authentication, choose None, Basic, Basic (Credentials Secret), Auth Tokens, or Auth Token (Text Secret) as required. See Set up Authentication.
Under Encrypt, enable TLS and set the minimum TLS version when senders must connect over TLS. See Set Up Encryption.
Select Save to create the Source.
Set Datatype Rules
Next, configure Datatype rules to parse, filter, and normalize your data into structured fields.
On the Cribl.Cloud top bar, select Products > Search > Data > Datatyping (auto). Here, you can:
- Use Auto-Datatyping to parse your data automatically.
- Check for uncategorized data that didn’t match any Datatype rules.
- Handle the uncategorized data by adding custom Datatype rules.
See also:
- Datatypes in Cribl Search
- v2 Datatypes in Cribl Search
- List of Stock v2 Datatypes
- Add a Custom v2 Datatype
Add a Dataset and Set Dataset Rules
Next, create a Dataset and add Dataset rules to route parsed events into it.
Add a Dataset
- On the Cribl.Cloud top bar, select Products > Search > Data > Datasets.
- Select Add Dataset.
- Enter a unique ID.
- Optionally add a Description and Tags.
- Set Dataset Provider to `lakehouse`.
- Select the Lakehouse engine that will store the data.
- Set the Retention period.
- Select Save.
Set Dataset Rules
Dataset rules enable you to route ingested events into individual Search Datasets so you can scope your queries and control retention.
See Organize Your Data for details around how to configure your Dataset rules and plan your Search Datasets based on estimated future storage costs.
Instrument Your Application
Use the Python instructions below with OpenInference AnthropicInstrumentor after your Search OpenTelemetry Source, Datatype rules, and Dataset rules are in place.
There is no official OpenTelemetry instrumentation package for the Anthropic Node.js SDK.
Python
These steps use OpenInference AnthropicInstrumentor with Python 3.9+. Use automatic instrumentation for the quickest path, or code-based setup when you need explicit control over the tracer.
Auto-Instrumentation
- Install packages:

  ```shell
  pip install opentelemetry-distro opentelemetry-exporter-otlp \
      anthropic openinference-instrumentation-anthropic
  ```

- Optionally, bootstrap additional instrumentation packages:

  ```shell
  opentelemetry-bootstrap -a install
  ```

- Run your application with the auto-instrumentor:

  ```shell
  opentelemetry-instrument python app.py
  ```

The instrumentor reads its configuration from `OTEL_*` environment variables.
Code-Based Setup
Register a tracer provider and AnthropicInstrumentor in a small bootstrap module, then import that module before you create an Anthropic client.
- Create a `tracing.py` file:
```python
# tracing.py: import this BEFORE your app code
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from openinference.instrumentation.anthropic import AnthropicInstrumentor

resource = Resource.create({"service.name": "my-anthropic-app"})
provider = TracerProvider(resource=resource)
provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))
trace.set_tracer_provider(provider)

AnthropicInstrumentor().instrument(tracer_provider=provider)
```

- Import `tracing` at the top of your application entry point, before any Anthropic calls:
```python
import tracing  # must be first
import anthropic

client = anthropic.Anthropic()
message = client.messages.create(
    model="claude-3-5-haiku-20241022",
    max_tokens=256,
    messages=[{"role": "user", "content": "Hello!"}],
)
print(message.content[0].text)
```

Set Environment Variables
Set these environment variables before running your instrumented application:
Use gRPC (port 4317)
```shell
export OTEL_EXPORTER_OTLP_ENDPOINT="http://<cribl-host>:4317"
export OTEL_EXPORTER_OTLP_PROTOCOL="grpc"
export OTEL_SERVICE_NAME="my-anthropic-app"
# Only if you enabled auth on the Search OpenTelemetry Source:
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer <token>"
```

Use HTTP/protobuf (port 4318)
```shell
export OTEL_EXPORTER_OTLP_ENDPOINT="http://<cribl-host>:4318"
export OTEL_EXPORTER_OTLP_PROTOCOL="http/protobuf"
export OTEL_SERVICE_NAME="my-anthropic-app"
# Only if you enabled auth on the Search OpenTelemetry Source:
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer <token>"
```

Environment Variable Reference
| Variable | Required | Description |
|---|---|---|
| `OTEL_EXPORTER_OTLP_ENDPOINT` | Yes | Cribl Search OpenTelemetry Source URL (`http://<cribl-host>:4317` for gRPC, `:4318` for HTTP). |
| `OTEL_EXPORTER_OTLP_PROTOCOL` | Yes | `grpc` or `http/protobuf`. |
| `OTEL_SERVICE_NAME` | Yes | Logical name for your application (appears in traces). |
| `OTEL_EXPORTER_OTLP_HEADERS` | No | Auth header if the Source requires it (for example, `Authorization=Bearer <token>`). |
| `ANTHROPIC_API_KEY` | Yes | Your Anthropic API key (used by the `anthropic` SDK, not by OTel). |
Capture Prompt/Completion Content
Enabling content capture may expose sensitive data in your telemetry pipeline.
```shell
export OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT="true"
```

Verify Data Flow in Cribl Search
After completing the instrumentation process, confirm OTLP events flow from your instrumented app into Cribl Search.
- On the Cribl.Cloud top bar, select Products > Search > Data > Live Data.
- Select your OpenTelemetry Source and confirm events appear while you trigger traffic from your Anthropic application.
- For full Live Data behavior and troubleshooting, see Live Data Flow.
Common Search Issues and Fixes
| Symptom | Cause | Fix |
|---|---|---|
| No events returned. | OpenTelemetry Source is not receiving data, or Dataset rules are not routing data correctly. | Verify the Source address, port, protocol, and auth settings. Check Data > Live Data for the OpenTelemetry Source, then confirm your Dataset rule sends matching events to the correct Search Dataset. |
| Empty time range. | No recent traffic, or event timestamps are outside your selected window. | Widen Time range; generate a test Anthropic call from your instrumented app. Review timestamp parsing in your Datatype configuration. |
| Fields not where you expect. | Auto-Datatyping did not classify the events as expected, or your Datatype rules need refinement. | Review the data in Live Data, check for Uncategorized events, and update or add Datatype rules so the data parses into the fields you expect. |
| Slow or expensive searches. | Large partitions scanned. | Narrow time range, increase sampling, or pre-aggregate in Stream. |
| Permission errors. | Insufficient Search role. | Ask an Admin to grant Editor (or higher) for Dataset management, or User for running searches per org policy. |
| 401 from Anthropic. | Invalid or missing API key. | Verify ANTHROPIC_API_KEY in the environment where the app runs. |
LLM Telemetry Use Cases
With data flowing into Cribl Search, you can explore and gain visibility into your Anthropic telemetry:
- Explore LLM telemetry: Explore your Search Dataset in Cribl Search and query it with KQL. Filter and slice by service, model, environment, token usage, cost signals, and error or status fields.
- Investigate incidents and performance regressions: During elevated errors or latency, investigate your LLM Search Dataset. Use aggregations such as `summarize` or `timestats` to break down failures by model, deployment, feature, tenant, or region. Use `join` or related patterns with infrastructure, gateway, or security Datasets for end-to-end context.
- Build dashboards for LLM usage and cost: Use Cribl Search Dashboards to track requests over time by model or app, token usage and derived cost, error rates by environment or tenant, and latency percentiles (for example, P95/P99), plus token distributions and traffic share.
- Automate checks with scheduled searches and notifications: Turn important queries (for example, when estimated LLM cost or error rate crosses a threshold over a time window) into Scheduled Searches and attach Notifications to email, Slack, SNS, webhooks, and other targets.
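As a sketch of the cost derivation mentioned above, the snippet below estimates per-call cost from the token-usage fields captured on each span. The model name comes from this guide's examples, but the per-million-token prices are placeholders, not actual Anthropic pricing; substitute your own rate card.

```python
# Estimate LLM cost from token-usage span fields.
# NOTE: the prices below are illustrative placeholders, not real
# Anthropic pricing -- replace them with your own rate card.
PRICE_PER_MTOK = {
    "claude-3-5-haiku-20241022": {"input": 0.80, "output": 4.00},
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost for one call."""
    rates = PRICE_PER_MTOK[model]
    return (input_tokens * rates["input"]
            + output_tokens * rates["output"]) / 1_000_000

cost = estimate_cost("claude-3-5-haiku-20241022", 120, 64)
print(f"{cost:.6f}")  # 0.000352
```

The same arithmetic can be expressed as a derived column in a scheduled search that alerts when aggregate cost crosses a threshold.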
Instrument Anthropic applications with OpenTelemetry and Cribl Stream
Learn how to instrument Anthropic applications with OpenTelemetry, export large language model (LLM) telemetry via OTLP, and send it into Cribl Stream. Once data is in Cribl Stream, you can route, mask, and enrich it before it reaches your observability backends.
You’ll complete the following high-level steps:
- Configure an OpenTelemetry (OTel) Source in Cribl Stream to receive OTLP data.
- Instrument your Anthropic application (Python) with OpenTelemetry.
- Explore LLM telemetry use cases for Cribl Stream.
Telemetry Captured
Each Anthropic Messages API call produces a trace span with attributes such as:
| Category | Data | Example |
|---|---|---|
| Timing | Call duration. | 0.9s. |
| Model | Requested model. | claude-3-5-haiku-20241022, claude-3-5-sonnet-20241022. |
| Token usage | Input and output tokens. | 120 input, 64 output. |
| Request params | Max tokens, temperature, system prompt. | max_tokens=256. |
| Response | Stop reason, message id. | end_turn. |
| Errors | HTTP status, API error. | 401, 429, invalid model id. |
| Content | Prompt and completion text. | (opt-in, disabled by default). |
Semantic conventions follow OpenTelemetry GenAI Semantic Conventions where applicable.
Prerequisites
- A Cribl Stream instance (v4.x+) with an OpenTelemetry (OTel) Source enabled (gRPC 4317 or HTTP/protobuf 4318).
- An Anthropic API key, set as `ANTHROPIC_API_KEY`.
- Python 3.9+.
Configure the OpenTelemetry (OTel) Source in Cribl Stream
Before instrumenting your Anthropic app, configure Cribl Stream to receive OTLP data.
Configure an OTel Source
To receive OTLP from your LLM application, add or edit an OpenTelemetry Source on your Worker Group as follows.
On the top bar, select Products, and then select Cribl Stream. Under Worker Groups, select a Worker Group. Next, you have two options:
- To configure via QuickConnect, navigate to Routing > QuickConnect (Stream) or Collect (Edge). Select Add Source and select the Source you want from the list, choosing either Select Existing or Add New.
- To configure via Routes, select Data > Sources (Stream) or More > Sources (Edge). Select the Source you want, then select Add Source.
In the New Source modal, configure the following under General Settings:
- Input ID: Unique ID for this Source. For example, `OTel042`.
- Description: Optionally, enter a description.
- OTLP version: The drop-down offers `0.10.0` and `1.3.1` (default).
- Protocol: Use the drop-down to choose the protocol matching the data you will ingest: `gRPC` (default) or `HTTP`.
- Address: Enter the hostname/IP to listen on. Defaults to `0.0.0.0` (all addresses, IPv4 format).
- Port: By default, OTel applications send output to port `4317` when using the gRPC protocol, and port `4318` when using HTTP. This setting defaults to `4317`. Change it if you set Protocol to `HTTP`, or if you want Cribl Stream to collect data from an OTel application that uses a different port. Note that port `4318` is not available on Cribl-managed Worker Groups in Cribl.Cloud.
- The Extract spans, Extract metrics, and Extract logs settings are specific to the OpenTelemetry Source. By default, these options are toggled off, so Cribl Stream acts as a pass-through that generates a single event for each incoming OTel payload. This is useful when you want to forward complete OTel events to downstream systems, such as persistent storage, without breaking them apart. Enable these settings to extract and process individual records from within OTel events:
  - Extract spans: Generates an individual event for each span in a trace. Traces typically contain multiple spans.
  - Extract metrics: Generates an individual event for each data point in a metric event. OTel metrics often contain multiple data points per event.
  - Extract logs: Available only when OTLP version is set to `1.3.1`. Generates an individual event for each log record. Cribl recommends enabling this option to simplify log transformation and manipulation.
- Tags: Optionally, add tags to help filter and group Sources within Cribl Stream’s UI. Tags are not included in the event data. Separate tag names using a tab or hard return.
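To make the Extract spans behavior concrete, here is a conceptual sketch: one incoming OTLP payload fans out into one event per span. The payload shape is a simplified stand-in for real OTLP/JSON, and `extract_spans` is a hypothetical helper illustrating what the toggle does inside Cribl Stream, not Cribl code.

```python
# Simplified stand-in for an OTLP/JSON trace payload with two spans.
payload = {
    "resourceSpans": [{
        "resource": {"attributes": [
            {"key": "service.name", "value": {"stringValue": "my-anthropic-app"}},
        ]},
        "scopeSpans": [{
            "spans": [
                {"name": "anthropic.messages.create", "spanId": "a1"},
                {"name": "anthropic.messages.create", "spanId": "a2"},
            ],
        }],
    }],
}

def extract_spans(payload: dict) -> list:
    """Fan one payload out into one event per span (the 'Extract spans' idea)."""
    events = []
    for rs in payload.get("resourceSpans", []):
        for ss in rs.get("scopeSpans", []):
            events.extend(ss.get("spans", []))
    return events

print(len(extract_spans(payload)))  # 2
```

With the toggle off, this payload would instead arrive as a single event containing both spans.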
Optionally, you can adjust the Authentication, TLS, Persistent Queue Settings, Processing, and Advanced settings, or Connected Destinations.
Select Save, then Commit & Deploy.
Instrument Your Application
Use the Python instructions below with OpenInference AnthropicInstrumentor after your OTel Source is in place.
There is no official OpenTelemetry instrumentation package for the Anthropic Node.js SDK.
Python
Use automatic instrumentation with opentelemetry-instrument for the least code churn, or code-based setup when you want explicit control over tracer initialization and import order.
Auto-Instrumentation
- Install the OpenTelemetry distro, OTLP exporter, Anthropic SDK, and OpenInference Anthropic instrumentation:

  ```shell
  pip install opentelemetry-distro opentelemetry-exporter-otlp \
      anthropic openinference-instrumentation-anthropic
  ```

- Optionally, bootstrap additional instrumentation packages:

  ```shell
  opentelemetry-bootstrap -a install
  ```

- Write your application using the Anthropic client as usual:

  ```python
  import anthropic

  client = anthropic.Anthropic()
  message = client.messages.create(
      model="claude-3-5-haiku-20241022",
      max_tokens=256,
      messages=[{"role": "user", "content": "What is OpenTelemetry in one sentence?"}],
  )
  print(message.content[0].text)
  ```

- Run with the auto-instrumentor:

  ```shell
  opentelemetry-instrument python app.py
  ```

The instrumentor reads its configuration from `OTEL_*` environment variables.
Code-Based Setup
Register a tracer provider and AnthropicInstrumentor in a small bootstrap module, then import that module before you create an Anthropic client or call the API.
- Create a `tracing.py` file:
```python
# tracing.py: import this BEFORE your app code
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from openinference.instrumentation.anthropic import AnthropicInstrumentor

resource = Resource.create({"service.name": "my-anthropic-app"})
provider = TracerProvider(resource=resource)
provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))
trace.set_tracer_provider(provider)

AnthropicInstrumentor().instrument(tracer_provider=provider)
```

- Import `tracing` at the top of your application entry point, before any Anthropic calls:
```python
import tracing  # must be first
import anthropic

client = anthropic.Anthropic()
message = client.messages.create(
    model="claude-3-5-haiku-20241022",
    max_tokens=256,
    messages=[{"role": "user", "content": "Hello!"}],
)
print(message.content[0].text)
```

Set Environment Variables
Set these environment variables before running your instrumented application:
Use gRPC (port 4317)
Set these environment variables to export over gRPC, the recommended method:
```shell
export OTEL_EXPORTER_OTLP_ENDPOINT="http://<cribl-host>:4317"
export OTEL_EXPORTER_OTLP_PROTOCOL="grpc"
export OTEL_SERVICE_NAME="my-anthropic-app"
# Only if you enabled auth on the Cribl OTel Source:
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer <token>"
```

Use HTTP/protobuf (port 4318)
If your OTLP exporter or network path requires HTTP/protobuf instead of gRPC, use port 4318 and set the protocol as shown.
```shell
export OTEL_EXPORTER_OTLP_ENDPOINT="http://<cribl-host>:4318"
export OTEL_EXPORTER_OTLP_PROTOCOL="http/protobuf"
export OTEL_SERVICE_NAME="my-anthropic-app"
# Only if you enabled auth on the Cribl OTel Source:
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer <token>"
```

Environment Variable Reference
Use this table to set OTLP export options, optional auth headers for the Cribl Source, and the Anthropic API key for the anthropic SDK.
| Variable | Required | Description |
|---|---|---|
| `OTEL_EXPORTER_OTLP_ENDPOINT` | Yes | Cribl Stream OTel Source URL (`http://<cribl-host>:4317` for gRPC, `:4318` for HTTP). |
| `OTEL_EXPORTER_OTLP_PROTOCOL` | Yes | `grpc` or `http/protobuf`. |
| `OTEL_SERVICE_NAME` | Yes | Logical name for your application (appears in traces). |
| `OTEL_EXPORTER_OTLP_HEADERS` | No | Auth header if the Cribl Source requires it (for example, `Authorization=Bearer <token>`). |
| `ANTHROPIC_API_KEY` | Yes | Your Anthropic API key (used by the `anthropic` SDK, not by OTel). |
Capture Prompt/Completion Content
Enabling content capture may expose sensitive data in your telemetry pipeline. Use Cribl Stream’s masking and redaction functions to sanitize data before routing to downstream destinations.
By default, prompt and completion content is often not captured or is limited; check instrumentor behavior for your version. To align with typical GenAI instrumentation toggles:

```shell
export OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT="true"
```

Verify Data Flow in Cribl Stream
After completing the instrumentation process, confirm OTLP events flow from your instrumented app into Cribl Stream.
- In Cribl Stream, go to Monitoring > Sources and select your OTel Source.
- Open the Live Data tab.
- Select Start Capture.
- Trigger a request in your Anthropic application.
- Confirm that events appear in Live Data for your OTel Source. Look for model id, usage metadata, and status. If content capture is enabled, ensure prompts appear only where policy allows and mask or redact in a Pipeline if needed.
Route Data to Cribl Search
After Anthropic OTLP data reaches Cribl Stream, add a Cribl Search Destination so you can explore and investigate LLM telemetry to troubleshoot issues, analyze model behavior, and monitor performance. See the Cribl Search Destination guide for setup instructions.
Common Issues and Fixes
If spans are missing or the exporter cannot reach Cribl Stream, use this table to narrow down endpoint, authentication, TLS, API keys, and instrumentation issues.
| Symptom | Cause | Fix |
|---|---|---|
| No data in Cribl Stream. | Wrong endpoint or port. | Verify OTEL_EXPORTER_OTLP_ENDPOINT matches the Cribl OTEL Source address and port. |
| Connection refused. | Source not running or firewall blocking. | Ensure the OTEL Source is enabled and the port is open. |
| 401 Unauthorized (Cribl). | Auth mismatch on OTel Source. | Check OTEL_EXPORTER_OTLP_HEADERS matches the auth config on the Cribl Source. |
| 401 from Anthropic. | Invalid or missing API key. | Verify ANTHROPIC_API_KEY. |
| Missing spans. | Instrumentation not loaded. | Use opentelemetry-instrument or ensure AnthropicInstrumentor().instrument() runs before any Anthropic client calls. |
| Streaming call or span issues. | SDK version. | Upgrade the `anthropic` package (`pip install -U anthropic`); streaming APIs are version-sensitive. |
| TLS errors. | TLS mismatch. | Use https:// in the endpoint if TLS is enabled on the Source, or disable TLS on the Source for local testing. |
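For the "connection refused" and wrong-port symptoms above, it can help to confirm the OTLP port is reachable before digging into exporter configuration. This is a generic TCP check, not a Cribl tool; `port_open` is a hypothetical helper name.

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: check the default gRPC OTLP port on your Cribl host.
print(port_open("localhost", 4317))
```

A `False` result points at the Source being disabled, a wrong port, or a firewall, rather than an instrumentation problem.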
For deeper exporter debugging:
```shell
OTEL_LOG_LEVEL=debug opentelemetry-instrument python app.py
```

LLM Telemetry Use Cases
With data flowing into Cribl Stream, you can route Anthropic telemetry to any supported Destination. For example:
- Cribl Search: Search and analyze full-fidelity LLM traces and logs interactively.
- Cribl Lake: Store and search traces natively with partitioning optimized for Cribl Search, so you can query Claude traces at scale without re-ingesting them into a separate system.
- Amazon S3 or other object storage: Archive traces for compliance or long-term analysis using partitioning schemes aligned with Cribl Search, so you can query data in place without moving it.
- OpenTelemetry: Forward LLM traces in standard OTLP format to downstream systems.
To set up routing:
- Add Routes in Routing > Data Routes to match OTel data and direct it to your chosen Destinations.
- Go to Processing > Pipelines and create a Pipeline to process data flowing down your Route.
- Use Functions to enrich, filter, sample, or redact data before delivery. For example:
- Mask prompt/completion content containing sensitive information.
- Sample high-volume traces to reduce cost.
- Aggregate token usage metrics by model or service.
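In Cribl Stream, masking is configured with the Mask Function in a Pipeline; the Python sketch below only illustrates the kind of redaction logic involved. The patterns and placeholder names are illustrative assumptions, not a complete PII ruleset or Cribl configuration.

```python
import re

# Illustrative patterns only -- a real deployment needs a vetted PII ruleset.
PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "<ssn>"),          # US-SSN-like numbers
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "<email>"),  # email addresses
]

def mask_content(text: str) -> str:
    """Replace each matched pattern with its placeholder token."""
    for pattern, replacement in PATTERNS:
        text = pattern.sub(replacement, text)
    return text

print(mask_content("Contact jane.doe@example.com, SSN 123-45-6789."))
# Contact <email>, SSN <ssn>.
```

Applying this kind of redaction before delivery keeps captured prompt and completion content out of downstream systems that should not see it.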
For more detailed LLM telemetry use cases, see LLM Telemetry Use Cases in Cribl.