Type: Push | TLS Support: YES | Event Breaker Support: No
Select Data > Sources, then select Elasticsearch API from the Data Sources page's tiles or left menu. Click Add New to open the Elasticsearch API > New Source modal, which provides the fields outlined below.
LogStream ships with an Elasticsearch API Source preconfigured to listen on Port 9200. You can clone or directly modify this Source to further configure it, and then enable it.
Input ID: Enter a unique name to identify this Elasticsearch Source definition.
Address: Enter the hostname/IP on which to listen for Elasticsearch data. (E.g., localhost or 0.0.0.0.)
Port: Enter the port number.
Auth tokens: Shared secrets to be provided by any client (Authorization: <token>). Click Generate to create a new secret. If empty, unauthenticated access will be permitted.
Elasticsearch API endpoint (for Bulk API): Absolute path on which to listen for Elasticsearch API requests. Defaults to /. LogStream automatically appends _bulk, so (e.g.) /myPath becomes /myPath/_bulk. Requests could then be made to either /myPath/_bulk or /myPath/<myIndexName>/_bulk. Other entries are faked as success.
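To illustrate the Bulk API request format this Source accepts, here is a minimal Python sketch. The host placeholder, path, index name, and token below are examples only (taken from this page's sample values), not required settings:

```python
import json

def build_bulk_payload(index, docs):
    """Build an Elasticsearch Bulk API body (NDJSON): an action line
    followed by a source line for each document, newline-terminated."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"

# Placeholder values for illustration:
url = "http://<LOGSTREAM_HOST>:9200/myPath/_bulk"
headers = {
    "Content-Type": "application/x-ndjson",
    "Authorization": "myToken42",  # only needed if an Auth token is configured
}
payload = build_bulk_payload("myIndexName", [{"message": "hello"}])
# POST `payload` to `url` with `headers` using your HTTP client of choice.
```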
Enabled: Defaults to No. When toggled to Yes, the remaining TLS settings in this section become available.
Certificate name: Name of the predefined certificate.
Private key path: Server path at which to find the private key to use, in PEM format. Path can reference $ENV_VARS.
Passphrase: Passphrase to use to decrypt private key.
Certificate path: Server path at which to find certificates (in PEM format) to use. Path can reference $ENV_VARS.
CA certificate path: Server path at which to find CA certificates (in PEM format) to use. Path can reference $ENV_VARS.
Authenticate client (mutual auth): Require clients to present their certificates. Used to perform mutual authentication using SSL certs. Defaults to No. When toggled to Yes, the client-certificate settings below become available.
Validate client certs: Reject certificates that are not authorized by a CA in the CA certificate path, or by another trusted CA (e.g., the system's CA). Defaults to No.
Common name: Regex matching subject common names in peer certificates allowed to connect. Defaults to .*. Matches on the substring after CN=. As needed, escape regex tokens to match literal characters. E.g., to match the subject CN=worker.cribl.local, you would enter: worker\.cribl\.local.
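As a quick sanity check of such a pattern, here is how the escaped form behaves in Python's re module (the hostname is the example subject above):

```python
import re

# Escaped dots match only literal "." characters in the subject CN.
pattern = re.compile(r"worker\.cribl\.local")

print(bool(pattern.fullmatch("worker.cribl.local")))  # True
print(bool(pattern.fullmatch("workerXcriblXlocal")))  # False (unescaped "." would have matched this too)
```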
Minimum TLS version: Optionally, select the minimum TLS version to accept from connections.
Maximum TLS version: Optionally, select the maximum TLS version to accept from connections.
In this section, you can add fields/metadata to each event using Eval-like functionality.
Name: Field name.
Value: JavaScript expression to compute the field's value (can evaluate to a constant).
In this section's Pipeline drop-down list, you can select a single existing Pipeline to process data from this input before the data is sent through the Routes.
Max active requests: Maximum number of active requests allowed for this Source, per Worker Process. Defaults to 0 for unlimited.
The Elasticsearch API input normalizes the following fields:
- @timestamp becomes _time, at millisecond resolution.
- host is set to host.name.
- Original object host is stored in __host.
The Elasticsearch Destination does the reverse, and it also recognizes the presence of __host.
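As a rough sketch only (not Cribl's actual implementation), the normalization described above could look like the following, assuming events arrive with Elasticsearch-style @timestamp and host fields:

```python
def normalize(event):
    """Illustrative sketch of the field normalization described above
    (not Cribl's code): @timestamp -> _time, the original host object is
    preserved in __host, and host is replaced by host.name."""
    out = dict(event)
    if "@timestamp" in out:
        # _time is kept at millisecond resolution
        out["_time"] = out.pop("@timestamp")
    host = out.get("host")
    if isinstance(host, dict):
        out["__host"] = host          # preserve the original object
        out["host"] = host.get("name")
    return out

event = {"@timestamp": 1690000000123, "host": {"name": "web01"}, "message": "hi"}
normalized = normalize(event)
```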
Cribl LogStream uses a set of internal fields to assist in handling of data. These "meta" fields are not part of an event, but they are accessible, and Functions can use them to make processing decisions.
Fields for this Source:
To set up Filebeat to send data to LogStream, use its Elasticsearch output. If an Auth Token is configured here, add it in Filebeat configuration under
output.elasticsearch.headers, as in this example:
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["http://<LOGSTREAM_HOST>:9200/elastic"]
output.elasticsearch.headers:
  Authorization: "myToken42"