Cribl LogStream Documentation (v3.1.1)


Cribl LogStream can send data to various Destinations, including Splunk, Kafka, Kinesis, InfluxDB, Snowflake, Databricks, TCP JSON, and many others.

Streaming Destinations

Destinations that accept events in real time are referred to as streaming Destinations.

Non-Streaming Destinations

Destinations that accept events in groups or batches are referred to as non-streaming Destinations.


The S3 Compatible Stores Destination can be adapted to send data to downstream services like Databricks and Snowflake, for which LogStream currently has no preconfigured Destination. For details, please contact Cribl Support.

Other Destinations

LogStream also provides these special-purpose Destinations:

  • Default: Here, you can specify a default output from among your configured Destinations.
  • Output Router: Flexible "meta-destination." Here, you can configure rules that route data to multiple configured Destinations.
  • DevNull: An output that simply drops events. Preconfigured and active when you install LogStream, so it requires no configuration. Useful for testing.
  • SpaceOut: This experimental Destination is undocumented. Be careful!

How Does Non-Streaming Delivery Work?

Cribl LogStream uses a staging directory in the local filesystem to format and write output events before sending them to configured Destinations. After a set of conditions is met (typically file size and number of files; further details below), data is compressed and then moved to the final Destination.

An inventory of open (in-progress) files is kept in the staging directory's root, to avoid having to walk that directory at startup; such a walk can get expensive if the staging directory is also the final directory. At startup, Cribl LogStream checks for any files left in progress by prior sessions, and ensures that they're moved to their final Destination. This move is delayed after startup (default delay: 30 seconds), and processing of these files is paced at one file per service period (default: 1 second).
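The startup-recovery pacing described above can be sketched as follows. This is a minimal illustration, not Cribl's implementation; the function and parameter names are hypothetical, and only the two defaults (30-second delay, 1-second service period) come from the text:

```python
import time

STARTUP_DELAY = 30   # seconds to wait after startup before processing leftovers (default)
SERVICE_PERIOD = 1   # seconds between files (default)

def recover_leftovers(inventory, move_to_destination, sleep=time.sleep):
    """Move files left in progress by a prior session to their final
    Destination: wait out the startup delay, then pace the work at
    one file per service period."""
    sleep(STARTUP_DELAY)
    for path in inventory:
        move_to_destination(path)
        sleep(SERVICE_PERIOD)
```

The `sleep` parameter is injected only so the pacing can be tested without real waiting.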

Batching Conditions

Several conditions govern when files are closed and rolled out:

  1. File reaches its configured maximum size.

  2. File reaches its configured maximum open time.

  3. File reaches its configured maximum idle time.

If a new file needs to be opened, Cribl LogStream will enforce the maximum number of open files by closing files in the order in which they were opened.
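To make the three roll conditions and the open-file limit concrete, here is a minimal Python sketch. The limit values and names are hypothetical stand-ins (the real values come from each Destination's configuration), and the logic is an illustration rather than Cribl's actual code:

```python
import time

# Hypothetical limits; in LogStream these come from the Destination's config.
MAX_FILE_SIZE = 32 * 1024 * 1024   # bytes
MAX_OPEN_TIME = 300                # seconds
MAX_IDLE_TIME = 30                 # seconds

class StagedFile:
    """Illustrative stand-in for one in-progress file in the staging directory."""
    def __init__(self, path):
        self.path = path
        self.size = 0
        self.opened_at = time.time()
        self.last_write = self.opened_at

def should_close(f, now):
    """A file rolls out when ANY one of the three conditions is met:
    max size, max open time, or max idle time."""
    return (f.size >= MAX_FILE_SIZE
            or now - f.opened_at >= MAX_OPEN_TIME
            or now - f.last_write >= MAX_IDLE_TIME)

def enforce_open_limit(open_files, max_open):
    """Before opening a new file, close files in the order in which
    they were opened until the limit leaves room for one more."""
    open_files.sort(key=lambda f: f.opened_at)
    while len(open_files) >= max_open:
        closed = open_files.pop(0)   # oldest open time is closed first
        # ...compress `closed` and move it to the final Destination...
```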

Data Delivery

Data is delivered to all Destinations on an at-least-once basis. When a Destination is unreachable, there are three possible behaviors:

  • Block - Cribl LogStream will block incoming events.
  • Drop - Cribl LogStream will drop events addressed to that Destination.
  • Queue - Cribl LogStream will Persistent-Queue events to that Destination.

You can configure the desired behavior through a Destination's Backpressure Behavior option. If this option is not present, Cribl LogStream's default behavior is to Block.
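The three backpressure behaviors can be sketched as a simple dispatch. This is an illustrative model only (the class, method, and `queue_limit` parameter are hypothetical, and the in-memory queue stands in for LogStream's persistent queue on disk):

```python
from collections import deque

class Destination:
    """Toy model of a Destination with a configurable Backpressure Behavior."""
    def __init__(self, behavior="block", queue_limit=1000):
        assert behavior in ("block", "drop", "queue")
        self.behavior = behavior
        self.queue = deque(maxlen=queue_limit)  # stand-in for a persistent queue

    def send(self, event, reachable):
        if reachable:
            return "delivered"
        if self.behavior == "drop":
            return "dropped"            # event addressed to this Destination is discarded
        if self.behavior == "queue":
            self.queue.append(event)    # held for later redelivery
            return "queued"
        return "blocked"                # default behavior: block incoming events
```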

Configuring Destinations

For each Destination type, you can create multiple definitions, depending on your requirements.

To configure Destinations, select Destinations from LogStream's global top nav (single-instance deployments), or from a Worker Group's top nav (distributed deployments). On the resulting Data Destinations page's tiles or left menu, select the desired type, then click + Add New.

Capturing Outgoing Data

You can capture data from a single enabled Destination directly from the Destinations UI, instead of using the Preview pane. To initiate an immediate capture, click the Live button on the Destination's configuration row.

Destination > Live button

You can also start an immediate capture from within an enabled Destination's configuration modal, by clicking the modal's Live Data tab.

Destination modal > Live Data tab
