Cribl LogStream – Docs



Cribl LogStream supports receiving data from Prometheus.


Type: Pull | TLS Support: No | Event Breaker Support: No
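Each target LogStream scrapes returns metrics in the Prometheus text exposition format: one sample per line, with a metric name, optional labels, and a value. As a hypothetical illustration (not LogStream's actual parser), a minimal sketch of that shape:

```javascript
// Hypothetical sketch: the Prometheus text exposition format that a
// /metrics endpoint returns, and a naive parser showing the
// name / labels / value shape of each sample. Real label values may
// contain escaped quotes, commas, or '=' characters, which this
// simplified split does not handle.
const sampleLine = 'http_requests_total{method="get",code="200"} 1027';

function parseSample(line) {
  const match = line.match(/^([a-zA-Z_:][a-zA-Z0-9_:]*)(?:\{(.*)\})?\s+(\S+)/);
  if (!match) return null;
  const [, name, labelStr, value] = match;
  const labels = {};
  if (labelStr) {
    for (const pair of labelStr.split(',')) {
      const [k, v] = pair.split('='); // naive split; see caveat above
      labels[k] = v.replace(/^"|"$/g, '');
    }
  }
  return { name, labels, value: Number(value) };
}

console.log(parseSample(sampleLine));
// { name: 'http_requests_total', labels: { method: 'get', code: '200' }, value: 1027 }
```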

Configuring Cribl LogStream to Receive Data from Prometheus

Select Data > Sources, then select Prometheus from the Data Sources page's tiles or left menu. Click Add New to open the Prometheus > New Source modal, which provides the following fields.

General Settings

Input ID: Enter a unique name to identify this Source definition.

Extra dimensions: Dimensions to include in events. By default, host and source are included.

Discovery type: Target discovery mechanism. Use Static (the default) to manually enter a list of targets. Select DNS or AWS EC2 to enable dynamic discovery of endpoints to scrape. Your selection determines which fields are displayed lower in this section:

  • Targets: Displayed for Discovery type: Static. List of Prometheus targets to pull metrics from. Values can be in URL or host[:port] format, for example: http://localhost:9090/metrics, localhost:9090, or localhost. Where only host[:port] is specified, the endpoint resolves to http://host[:port]/metrics.

  • DNS names: Displayed for Discovery type: DNS. Enter a list of DNS names to resolve.

  • Record type: Displayed for Discovery type: DNS. Select the DNS record type to resolve. Defaults to SRV (Service record). Other options are A and AAAA records.

  • Region: Displayed for Discovery type: AWS EC2. Select the AWS region in which to discover the EC2 instances with metrics endpoints to scrape.
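The static-target resolution described above can be sketched as follows (a hypothetical illustration, not LogStream's actual implementation):

```javascript
// Hypothetical sketch of static target normalization: full URLs are used
// as-is, while bare host or host:port entries resolve to
// http://host[:port]/metrics.
function resolveTarget(target) {
  if (/^https?:\/\//.test(target)) return target; // already a URL
  return `http://${target}/metrics`;              // host[:port] shorthand
}

console.log(resolveTarget('http://localhost:9090/metrics')); // unchanged
console.log(resolveTarget('localhost:9090')); // http://localhost:9090/metrics
console.log(resolveTarget('localhost'));      // http://localhost/metrics
```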

Poll interval: How often (in minutes) to scrape targets for metrics. Defaults to 15. This value must be an integer that divides evenly into 60 minutes.
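The "divides evenly into 60" rule limits valid intervals to 1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, or 60 minutes. A hypothetical validation sketch:

```javascript
// Hypothetical sketch of the poll interval rule above: the value must be
// an integer that divides evenly into 60 minutes.
function isValidPollInterval(minutes) {
  return Number.isInteger(minutes) && minutes >= 1 && 60 % minutes === 0;
}

console.log(isValidPollInterval(15)); // true (the default)
console.log(isValidPollInterval(7));  // false (60 / 7 is not an integer)
```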

Log level: Set the verbosity level to one of debug, info (the default), warn, or error.

Processing Settings

Fields (Metadata)

In this section, you can add fields/metadata to each event using Eval-like functionality.

Name: Field name.

Value: JavaScript expression to compute the field's value (can be a constant).
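As a hypothetical illustration of an Eval-like field (the field names and expressions below are examples, not defaults), the Value is a JavaScript expression evaluated against each event, or simply a constant:

```javascript
// Hypothetical example event with the default dimensions mentioned above.
const event = { host: 'web-01', source: 'prometheus' };

// Name: env         Value: 'production'          (a constant)
// Name: datacenter  Value: host.split('-')[0]    (computed from the event)
const env = 'production';
const datacenter = event.host.split('-')[0];

console.log(env, datacenter); // production web
```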


Pre-Processing

In this section's Pipeline drop-down list, you can select a single existing Pipeline to process data from this input before the data is sent through the Routes.

Advanced Settings

Keep alive time (seconds): How often workers should check in with the scheduler to keep the job subscription alive. Defaults to 60 seconds.

Worker timeout (periods): The number of Keep alive time periods after which an inactive worker's job subscription will be revoked. Defaults to 3 periods.
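With both defaults, the arithmetic works out as follows (a sketch assuming the defaults above):

```javascript
// Hypothetical arithmetic for the defaults above: a worker that misses
// 3 consecutive 60-second keep-alive periods loses its job subscription.
const keepAliveSeconds = 60; // Keep alive time default
const timeoutPeriods = 3;    // Worker timeout default
const revokeAfterSeconds = keepAliveSeconds * timeoutPeriods;

console.log(revokeAfterSeconds); // 180
```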

Internal Fields

Cribl LogStream uses a set of internal fields to assist in handling data. These "meta" fields are not part of an event, but they are accessible, and Functions can use them to make processing decisions.

Fields for this Source:

  • __source
  • __isBroken
  • __inputId
  • __final
  • __criblMetrics
  • __channel
  • __cloneCount


