Ingest Prometheus Metrics into Cribl Search
Collect metrics via the Prometheus Remote Write API to store them in Cribl Search for fast analysis.
Before You Begin
You’ll need:
- Cribl.Cloud Enterprise. For details, see Pricing.
- A lakehouse engine. For details, see Lakehouse Engines.
- Search Admin Permission or higher. For details on who can do what, see Cribl Search Permissions.
- A Prometheus client that can reach Cribl Search over HTTP(S).
You don’t need Cribl Stream, Edge, or Lake. (Looking for the Prometheus Remote Write Source in Cribl Stream instead?)
To query your Prometheus instance without moving data into Cribl Search, see Connect Cribl Search to Prometheus.
1. Add a Prometheus Remote Write Source in Cribl Search
On the Cribl.Cloud top bar, select Products > Search > Data > Add Source > Prometheus Remote Write.
Describe Your Source and Set the Endpoint
Under General, configure:
| Setting | Description | Example |
|---|---|---|
| ID | Source ID, unique across your Cribl.Cloud Workspace. Use letters, numbers, underscores, and hyphens. | prometheus_rw_prod |
| Description | Describe your Source so others know what it’s for. | Ingests Prometheus Remote Write metrics |
| Address | Hostname (FQDN) that your upstream sender connects to. You’ll need this to set up your upstream sender. | search.main.foo-bar-abc123.cribl.cloud |
| Port | Network port to listen on. Keep the default unless it conflicts with another service. | 20000 (default) |
| Remote Write API endpoint | Base path on which to listen for Prometheus Remote Write API requests. | /write (default) |
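Taken together, the Address, Port, and Remote Write API endpoint form the full URL your upstream sender targets. A minimal sketch in Python, using the example values from the table above (replace them with your own Source values):

```python
# Build the full Remote Write URL from the example Source settings above.
address = "search.main.foo-bar-abc123.cribl.cloud"
port = 20000
endpoint = "/write"

# With TLS enabled the scheme is https; without TLS, use http.
url = f"https://{address}:{port}{endpoint}"
print(url)  # https://search.main.foo-bar-abc123.cribl.cloud:20000/write
```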
Set up Authentication
Use authentication to make sure only authorized senders can push data to your Cribl Search Source.
Under Authentication, select the Authentication type you want to use:
No authentication. Use only for testing or trusted internal networks.
Create a username and password. This is what your upstream sender will need to provide when sending data to your Source endpoint.
| Setting | Example |
|---|---|
| Username | prometheus_rw_user |
| Password | ******** |
Authenticate using a stored credentials secret instead of entering a username and password directly. This keeps credentials out of your Source configuration and makes them easier to rotate.
| Setting | Description | Example |
|---|---|---|
| Credentials secret | Reference to a stored text secret that holds the credentials (username and password). Select a secret or Create a new one. (See Create and Manage Secrets in Cribl Stream.) | sec_prometheus_rw_creds |
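However you store the username and password, the upstream sender transmits them as a standard HTTP Basic `Authorization` header. A sketch of what that header looks like, using only Python's standard library (the password here is illustrative; never hard-code real credentials):

```python
import base64

username = "prometheus_rw_user"
password = "changeme"  # illustrative only

# HTTP Basic auth: base64-encode "username:password"
encoded = base64.b64encode(f"{username}:{password}".encode()).decode()
header = f"Basic {encoded}"
print(header)
```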
Create bearer tokens. This is what your upstream sender will need to provide in the Authorization header.
Select Add Token, then enter a token text or Generate a random one.
Authenticate using a stored token secret instead of entering a token text directly. This keeps tokens out of your Source configuration and makes them easier to rotate.
| Setting | Description | Example |
|---|---|---|
| Token secret | Reference to a stored text secret that holds the token. Select a secret or Create a new one. (See Create and Manage Secrets in Cribl Stream.) | sec_prometheus_rw_token |
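If you'd rather generate token values yourself than use the Generate option, any sufficiently random string works. A sketch using Python's standard `secrets` module:

```python
import secrets

# Generate a 32-byte (64 hex character) random bearer token.
token = secrets.token_hex(32)
print(token)
```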
Set Up Encryption
TLS encryption protects your data in transit between your upstream Prometheus client and the Cribl Search Source.
Under Encrypt, select Enabled, and set the Minimum TLS version you want to accept.
| TLS Version | When to Use |
|---|---|
| 1.3 | Recommended. Provides the best security. |
| 1.2 | Use only when connecting to older systems that don’t support TLS 1.3. |
| Older than 1.2 | Avoid if possible. These versions are no longer considered secure. |
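For reference, here's what enforcing a minimum TLS version looks like on the sender side, using Python's standard `ssl` module. This is a hypothetical client-side sketch for illustration, not Cribl configuration:

```python
import ssl

# Create a client-side TLS context that refuses anything older than TLS 1.3.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_3

print(ctx.minimum_version)  # TLSVersion.TLSv1_3
```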
Select Save to create the Source.
2. Set Datatype Rules
Configure Datatype rules to parse, filter, and normalize your data into structured fields. We call this process Datatyping.
On the Cribl.Cloud top bar, select Products > Search > Data > Datatyping (auto). Here, you can:
- Use Auto-Datatyping to parse your data automatically.
- Check for uncategorized data that didn’t match any Datatype rules.
- Handle the uncategorized data by adding custom Datatype rules.
See also:
- Datatypes in Cribl Search
- v2 Datatypes in Cribl Search
- List of Stock v2 Datatypes
- Add a Custom v2 Datatype
3. Set Dataset Rules
Configure Dataset rules to organize the parsed events into Datasets. This also determines how long the data is kept, as each Dataset has its own retention period.
On the Cribl.Cloud top bar, select Products > Search > Data > Datasets to organize your data. For details, see Organize Your Data.
4. Set Up Your Prometheus Client
Configure your upstream Prometheus client to send data to your Cribl Search Source.
You’ll need these details from your Source configuration:
| Setting | Example |
|---|---|
| Address | search.main.foo-bar-abc123.cribl.cloud |
| Port | 20000 (default) |
| Remote Write API endpoint | /write (default) |
Example: Prometheus Remote Write > Cribl Search
Add a remote_write block to your prometheus.yml file, using the following example.
Replace the example address (search.main.foo-bar-abc123.cribl.cloud), username, password, endpoint, and port (if you
chose a different port) with your Source values.
```yaml
remote_write:
  - url: "https://search.main.foo-bar-abc123.cribl.cloud:20000/write"
    basic_auth:
      username: "your_username"
      password: "********"
```

Without TLS, use http instead of https.
Replace the example address (search.main.foo-bar-abc123.cribl.cloud), token, endpoint, and port (if you changed the
default 20000) with your Source values.
```yaml
remote_write:
  - url: "https://search.main.foo-bar-abc123.cribl.cloud:20000/write"
    bearer_token: "420"
```

Without TLS, use http instead of https.
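For context when debugging with a proxy or packet capture: a conforming Remote Write client POSTs a snappy-compressed protobuf body with a fixed set of headers, per the Prometheus Remote Write specification. Prometheus sets these for you; the sketch below just shows what to expect on the wire:

```python
# Headers a Prometheus Remote Write client sends with each POST.
# Prometheus sets these automatically; shown here for debugging reference.
headers = {
    "Content-Encoding": "snappy",                  # body is snappy-compressed
    "Content-Type": "application/x-protobuf",      # protobuf WriteRequest
    "X-Prometheus-Remote-Write-Version": "0.1.0",  # remote write protocol version
}
for name, value in headers.items():
    print(f"{name}: {value}")
```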
5. See Live Data Flow
Verify that events are successfully flowing from your upstream sender into Cribl Search.
On the Cribl.Cloud top bar, select Products > Search > Data > Live Data.
Here, check for your Prometheus Remote Write Source. For details, see See Live Data Flow.
Next Steps
Now that your data is in Cribl Search, you can start using it. For example: