* Configure Fleet to send data from Elastic Agent to Logstash
* Create an Elastic Agent policy with the necessary integrations
* Configure Logstash to use the elastic_integration filter plugin
@@ -16,13 +15,12 @@ Process overview
Logstash elastic-integration Filter Plugin Guide
Overview
-The purpose of this guide is to walk through the steps necessary to configure Logstash to transform events
-collected by the Elastic Agent using our pre-built ingest node pipelines that normalize data to the Elastic
-Common Schema. This is possible with a new beta feature in Logstash known as the elastic-integration
+The purpose of this guide is to walk through the steps necessary to configure {ls} to transform events
+collected by the Elastic Agent using our pre-built Elastic Integrations that normalize data to the Elastic Common Schema (ECS).
+This is possible with a new beta feature in Logstash known as the elastic-integration
filter plugin.
-Using this new plugin, Logstash reads certain field values generated by the Elastic Agent that tells Logstash to
-fetch pipeline definitions from an Elasticsearch cluster which Logstash can then use to process events before
-sending them to thier configured destinations.
+Using this new plugin, Logstash reads certain field values generated by the Elastic Agent, and uses them to apply the transformations from Elastic Integrations so that it can further process events before
+sending them to their configured destinations.
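For orientation, the end result of this guide is a pipeline along the following lines. This is only a minimal sketch: the port, hostnames, and credentials are placeholders, and the options shown are the commonly documented ones for the `elastic_agent` input, `elastic_integration` filter, and `elasticsearch` output.

-----
input {
  elastic_agent {
    port => 5044                                  # port Fleet's Logstash output ships Elastic Agent data to
  }
}

filter {
  elastic_integration {
    hosts    => ["https://es.example.com:9200"]   # placeholder Elasticsearch cluster
    username => "logstash_user"                   # placeholder credentials with the minimum required privileges
    password => "changeme"
  }
}

output {
  elasticsearch {
    hosts       => ["https://es.example.com:9200"]
    api_key     => "id:api_key"                   # placeholder API key
    data_stream => true                           # route each event to the data stream named by its data_stream.* fields
  }
}
-----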
Prerequisites/Requirements
@@ -72,9 +70,8 @@ Figure 4: policy-output
. Click “Create agent policy” at the bottom of the flyout.
. The new policy should be listed on the Agent policies page now.
. Click on the policy name so that we can start configuring an integration.
-. On the policy page, click “Add integration”. This will take you to the integrations browser, where you
-can select an integration that will have data stream definitions (mappings, pipelines, etc.), dashboards,
-and data normalization pipelines that convert the source data into Elastic Common Schema.
+. On the policy page, click “Add integration”.
+This will take you to the integrations browser, where you can select an integration that will have everything necessary to _integrate_ that data source with your other data in the Elastic stack.
Figure 5: add-integration-to-policy
In this example we will search for and select the Crowdstrike integration.
-Every event sent from the Elastic Agent to Logstash contains specific meta-fields. Input event are expected
-to have data_stream.type, data_stream.dataset, and data_stream.namespace . This tells Logstash which pipelines
-to fetch from Elasticsearch to correctly process the event before sending that event to it’s destination output.
-Logstash performs a check quickly and often to see if an integrations associated ingest pipeline has had updates
-or changes so that events are processed with the most recent version of the ingest pipeline.
+Every event sent from the Elastic Agent to Logstash contains specific meta-fields.
+Input events are expected to have data_stream.type, data_stream.dataset, and data_stream.namespace.
+Logstash uses this information and its connection to Elasticsearch to determine which Integrations to apply to the event before sending that event to its destination output.
+Logstash frequently synchronizes with Elasticsearch to ensure it has the most recent versions of the enabled Integrations.
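As an illustration, using the Crowdstrike example from earlier in this guide, the meta-fields on an incoming event might look roughly like this when printed with a `rubydebug` codec (the values shown are illustrative placeholders, not guaranteed dataset names):

-----
"data_stream" => {
    "type"      => "logs",
    "dataset"   => "crowdstrike.falcon",
    "namespace" => "default"
}
-----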
All processing occurs in Logstash.
-The user or credentials specified in the elastic_integration plugin needs to have sufficient privileges to get
-
-the appropriate monitoring, pipeline definitions, and index templates necessary to transform the events. Mini-
-mum required privileges can be found here: https://www.elastic.co/guide/en/logstash/current/plugins-filters-
+The user or credentials specified in the elastic_integration plugin needs to have sufficient privileges to get information about Elasticsearch and the Integrations that are enabled.
+Minimum required privileges can be found here: https://www.elastic.co/guide/en/logstash/current/plugins-filters-elastic_integration.html#plugins-filters-elastic_integration-minimum_required_privileges.
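As a hypothetical example, those credentials are the ones supplied to the filter itself, either as a username/password pair or as an API key created with the minimum required privileges (values below are placeholders):

-----
filter {
  elastic_integration {
    hosts   => ["https://es.example.com:9200"]    # placeholder Elasticsearch endpoint
    api_key => "id:api_key"                       # placeholder API key granted the minimum required privileges
  }
}
-----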
docs/static/ea-integrations.asciidoc (1 addition, 1 deletion)
@@ -78,7 +78,7 @@ output { <3>
-----
<1> Use `filter-elastic_integration` as the first filter in your pipeline
-<2> You can use additional filters as long as they follow `filter-elastic_integration`
+<2> You can use additional filters as long as they follow `filter-elastic_integration`. They will have access to the event as-transformed by your enabled integrations.
<3> Sample config to output data to multiple destinations
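To make the two callouts above concrete, here is a hedged sketch of an additional filter that runs after `filter-elastic_integration` together with an output section that fans out to two destinations; the cloud ID, API key, tag, and file path are placeholders:

-----
filter {
  elastic_integration {
    cloud_id => "<cloud-id>"                      # placeholder
    api_key  => "<id:api_key>"                    # placeholder
  }
  mutate {
    # runs after the integration's transformations, so ECS fields are already present on the event
    add_tag => ["processed-by-logstash"]
  }
}

output {
  elasticsearch {
    cloud_id    => "<cloud-id>"                   # placeholder
    api_key     => "<id:api_key>"                 # placeholder
    data_stream => true
  }
  file {
    # hypothetical second destination; any Logstash output plugin works here
    path  => "/var/log/archive/%{[data_stream][dataset]}.json"
    codec => "json_lines"
  }
}
-----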