As we describe features and concepts, it helps to have a mental model of Cribl LogStream as a system that receives events from various sources, processes them, and then sends them to one or more destinations.
At its core, a function is a piece of code that executes on an event, and it encapsulates the smallest amount of processing that can happen to that event. For instance, a simple function might replace a term (say, `foo` with `bar`) on each event. Another might hash or encrypt `bar`, and yet another might add a field, say, `dc=jfk-42`, to any event with `source=*us-nyc-application.log`. Functions process each event that passes through them. To help improve performance, functions can optionally be configured with filters that limit processing to matching events only. More details on functions.
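The filter-then-process behavior described above can be sketched in TypeScript. This is a conceptual illustration, not LogStream's actual API: the `Event`, `Fn`, `addDc`, and `run` names are all assumptions made for the example.

```typescript
// An event is modeled here as a simple bag of string fields.
type Event = Record<string, string>;

// A function pairs a filter (should this event be processed?)
// with the processing itself.
type Fn = {
  filter: (e: Event) => boolean;
  apply: (e: Event) => Event;
};

// Hypothetical function: add dc=jfk-42 to events whose source
// matches *us-nyc-application.log.
const addDc: Fn = {
  filter: (e) => (e.source ?? "").endsWith("us-nyc-application.log"),
  apply: (e) => ({ ...e, dc: "jfk-42" }),
};

// Only events matching the filter are processed; all others
// pass through untouched, which is how filters save work.
function run(fn: Fn, e: Event): Event {
  return fn.filter(e) ? fn.apply(e) : e;
}
```

For example, `run(addDc, { source: "/var/log/us-nyc-application.log" })` returns an event carrying `dc: "jfk-42"`, while an event from any other source comes back unchanged.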
A series of functions is called a pipeline, and the order in which the functions are executed matters. Events are delivered at the beginning of a pipeline (by a Route; see below), and as each function processes them, they are passed to the next one down the line. Events only move forward, toward the end of the pipeline and eventually out of the system. More details on pipelines.
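The forward-only flow of a pipeline amounts to applying functions in order, each receiving the previous one's output. A minimal sketch, where the `Event` shape, `_raw` field, and function names are assumptions for illustration rather than LogStream internals:

```typescript
type Event = Record<string, string>;
type Fn = (e: Event) => Event;

// Events enter at the head of the pipeline and move only forward,
// function by function, until they leave the other end.
function runPipeline(fns: Fn[], e: Event): Event {
  return fns.reduce((ev, fn) => fn(ev), e);
}

// Two illustrative functions: replace a term, then add a field.
const replaceFoo: Fn = (e) => ({
  ...e,
  _raw: (e._raw ?? "").replace(/foo/g, "bar"),
});
const addDc: Fn = (e) => ({ ...e, dc: "jfk-42" });

const out = runPipeline([replaceFoo, addDc], { _raw: "foo happened" });
// out._raw is "bar happened" and out.dc is "jfk-42"
```

Order matters because each function sees the event as already transformed by everything before it; swapping the functions here would make `replaceFoo` operate on the output of `addDc` instead.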
Routes evaluate incoming events against filter expressions to find the appropriate pipeline to send them to. Routes are evaluated in order. A Route can be associated with only one pipeline and one output. By default, a Route-Pipeline-Output tuple will consume matching events. If the Final flag is disabled, one or more clones are sent down the pipeline while the original event continues down the rest of the Routes. This is very useful when the same set of events needs to be processed differently and delivered to different destinations. More details on routes.
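The in-order evaluation and the effect of the Final flag can be sketched as follows. This is a conceptual model only; the `Route` shape and the `route` helper are assumptions made for the example, not LogStream's configuration schema.

```typescript
type Event = Record<string, string>;

type Route = {
  filter: (e: Event) => boolean;
  pipeline: string; // the one pipeline this Route is associated with
  final: boolean;   // if false, a clone goes down the pipeline and the
                    // original keeps moving through later Routes
};

// Evaluate Routes in order and return the pipelines the event
// (or its clones) would be delivered to.
function route(routes: Route[], e: Event): string[] {
  const delivered: string[] = [];
  for (const r of routes) {
    if (!r.filter(e)) continue;
    delivered.push(r.pipeline);
    if (r.final) break; // matching event is consumed; stop here
  }
  return delivered;
}

// Illustrative Route table: a non-final NYC Route, then a catch-all.
const routes: Route[] = [
  {
    filter: (e) => (e.source ?? "").includes("us-nyc"),
    pipeline: "nyc-masking",
    final: false,
  },
  { filter: () => true, pipeline: "default", final: true },
];
```

With this table, an event with `source: "us-nyc-application.log"` is delivered to both `nyc-masking` and `default` (the first Route clones rather than consumes), while any other event matches only the final catch-all.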