Serialize
Use the Serialize Function to serialize an event’s content into a predefined format.
Usage
Filter: Filter expression (JS) that selects data to feed through the Function. Defaults to true, meaning it evaluates all events.
Description: Simple description of this Function. Defaults to empty.
Final: If toggled to Yes, stops feeding data to the downstream Functions. Defaults to No.
Type: Data output format. Defaults to CSV.
Library: Browse the Parser/Formatter library.
Fields to serialize: Required for the CSV, ELFF, CLF, and Delimited values Types. (All other formats support wildcard field lists.)
Source field: Field containing the object to serialize. Leave blank to serialize top-level event fields.
Destination field: Field to serialize the data into. Defaults to _raw.
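Conceptually, the Function reads the listed fields from the source object (or from the event's top level, if Source field is blank) and writes their serialized representation into the destination field. The sketch below illustrates that behavior in TypeScript; it is not Cribl's implementation, and the event shape and serializeCsv helper name are assumptions made for the example.

```typescript
// Illustrative sketch only, not Cribl's implementation.
// Serializes the chosen fields of an event into a CSV string
// and stores the result in a destination field.
type Event = Record<string, unknown>;

function serializeCsv(event: Event, fields: string[], destField: string): Event {
  const row = fields
    .map((f) => String(event[f] ?? "")) // missing fields become empty values
    .join(",");
  return { ...event, [destField]: row };
}

// Example: serialize level and type into a new "test" field.
const out = serializeCsv({ level: "info", type: "kafka" }, ["level", "type"], "test");
// out.test === "info,kafka"
```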
Examples
Scenario A: JSON to CSV
Assume a simple event that looks like this:
{"time":"2019-08-25T14:19:10.240Z","channel":"input","level":"info","message":"initializing input","type":"kafka"}
We want to serialize these fields: _time, channel, level, and type into a single string, in CSV format, stored in a new destination field called test.
To properly extract the key-value pairs from this event structure, we’ll use a built-in Event Breaker:
- Copy the above sample event to your clipboard.
- In the Preview pane, select Paste a Sample, and paste in the sample event.
- Under Select Event Breaker, choose ndjson (newline-delimited JSON), and click Save as a Sample File.
Now you’re ready to configure the Serialize Function, using the settings below:
Type: CSV
Fields to Serialize: _time channel level type
Destination Field: test
Source Field: [leave empty]
Result: test: 1566742750.24,input,info,kafka
In the new test field, you now see the values of the _time, channel, level, and type keys serialized into a single CSV string.
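The numeric timestamp in the result comes from the Event Breaker, which parses the ISO time into an epoch _time value that the CSV output then includes alongside the other fields. A small TypeScript check of that arithmetic (illustrative only):

```typescript
// Illustration of the timestamp conversion seen in the result above.
// Date.parse returns milliseconds since the epoch; dividing by 1000
// yields the epoch-seconds value 1566742750.24 shown in the test field.
const epochSeconds = Date.parse("2019-08-25T14:19:10.240Z") / 1000;
const test = [epochSeconds, "input", "info", "kafka"].join(",");
console.log(test); // "1566742750.24,input,info,kafka"
```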
Scenario B: CSV to JSON
Let’s assume that a merchant wants to extract a subset of each customer order, to aggregate anonymized order statistics across their customer base. The transaction data is originally in CSV format, but the statistical data must be in JSON.
Here’s a CSV header (which we don’t want to process), followed by a row that represents one order:
orderID,custName,street,city,state,zip
20200622102822,john smith,100 Main St.,Anytown,AK,99911
To convert to JSON, we’ll need to first parse each field from the CSV to a manipulable field in the Pipeline, which the Serialize Function will be able to reference. In this example, the new manipulable field is message.
Use the Parser Function:
Filter: true
Operation mode: Extract
Type: CSV
Source field: _raw
Destination field: message
List of fields: orderID custName street city state zip
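With these Parser settings, each order row is parsed into an object on the event. For the sample row above, the message field would look roughly like this (an illustrative sketch; the exact event shape may differ):

```typescript
// Approximate shape of the event after the Parser Function runs (illustrative).
const event = {
  _raw: "20200622102822,john smith,100 Main St.,Anytown,AK,99911",
  message: {
    orderID: "20200622102822",
    custName: "john smith",
    street: "100 Main St.",
    city: "Anytown",
    state: "AK",
    zip: "99911",
  },
};
```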
Now use the Serialize Function:
Filter: true
Type: JSON
Fields to serialize: city state
Source field: message
Destination field: orderStats
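With the sample order above, the destination field would come out looking something like orderStats: {"city":"Anytown","state":"AK"}. The following TypeScript sketch mirrors the two-step Pipeline (Parser: CSV to object, then Serialize: subset of fields to JSON). It is only an illustration under those assumptions, not Cribl's implementation, and the helper names are invented.

```typescript
// Illustrative sketch of the Scenario B Pipeline: Parser (CSV -> object),
// then Serialize (subset of fields -> JSON string). Not Cribl's implementation.
const HEADER = ["orderID", "custName", "street", "city", "state", "zip"];

function parseCsvRow(raw: string): Record<string, string> {
  const values = raw.split(",");
  return Object.fromEntries(HEADER.map((name, i) => [name, values[i] ?? ""]));
}

function serializeJson(obj: Record<string, string>, fields: string[]): string {
  return JSON.stringify(Object.fromEntries(fields.map((f) => [f, obj[f]])));
}

const message = parseCsvRow("20200622102822,john smith,100 Main St.,Anytown,AK,99911");
const orderStats = serializeJson(message, ["city", "state"]);
console.log(orderStats); // {"city":"Anytown","state":"AK"}
```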