Commit f81c0c8

Incorporate pre-review comments
1 parent 55d5045 commit f81c0c8

File tree

2 files changed: +19 −28 lines changed

docs/static/ea-integration-tutorial.asciidoc

Lines changed: 18 additions & 27 deletions
@@ -1,13 +1,12 @@
 [[ea-integrations-tutorial]]
-== Using {ls} with Elastic {integrations} (Beta) tutorial
-
-
-Logstash elastic-integration Filter
-Plugin Guide
-Ingest node pipelines in Logstash
+== Tutorial: {ls} `elastic_integration filter` to extend Elastic {integrations} (Beta)
+++++
+<titleabbrev>Tutorial: {ls} `elastic_integration filter`</titleabbrev>
+++++
 
 
 Process overview
+
 * Configure Fleet to send data from Elastic Agent to Logstash
 * Create an Elastic Agent policy with the necessary integrations
 * Configure Logstash to use the elastic_integration filter plugin
@@ -16,13 +15,12 @@ Process overview
 Logstash elastic-integration Filter Plugin Guide
 
 Overview
-The purpose of this guide is to walk through the steps necessary to configure Logstash to transform events
-collected by the Elastic Agent using our pre-built ingest node pipelines that normalize data to the Elastic
-Common Schema. This is possible with a new beta feature in Logstash known as the elastic-integration
+The purpose of this guide is to walk through the steps necessary to configure {ls} to transform events
+collected by the Elastic Agent using our pre-built Elastic Integrations that normalize data to the Elastic Common Schema (ECS).
+This is possible with a new beta feature in Logstash known as the elastic-integration
 filter plugin.
-Using this new plugin, Logstash reads certain field values generated by the Elastic Agent that tells Logstash to
-fetch pipeline definitions from an Elasticsearch cluster which Logstash can then use to process events before
-sending them to thier configured destinations.
+Using this new plugin, Logstash reads certain field values generated by the Elastic Agent, and uses them to apply the transformations from Elastic Integrations so that it can further process events before
+sending them to their configured destinations.
 
 Prerequisites/Requirements
 
@@ -72,9 +70,8 @@ Figure 4: policy-output
 . Click “Create agent policy” at the bottom of the flyout.
 . The new policy should be listed on the Agent policies page now.
 . Click on the policy name so that we can start configuring an integration.
-. On the policy page, click “Add integration”. This will take you to the integrations browser, where you
-can select an integration that will have data stream definitions (mappings, pipelines, etc.), dashboards,
-and data normalization pipelines that convert the source data into Elastic Common Schema.
+. On the policy page, click “Add integration”.
+This will take you to the integrations browser, where you can select an integration that will have everything necessary to _integrate_ that data source with your other data in the Elastic stack.
 
 Figure 5: add-integration-to-policy
 In this example we will search for and select the Crowdstrike integration.
@@ -118,7 +115,6 @@ filter {
   elastic_integration {
     hosts => "{es-host}:9200"
     ssl_enabled => true
-    ssl_verification_mode => "certificate"
     ssl_certificate_authorities => ["/usr/share/logstash/config/certs/ca-cert.pem"]
     auth_basic_username => "elastic"
     auth_basic_password => "changeme"
@@ -136,7 +132,6 @@ output {
     password => "changeme"
     user => "elastic"
     cacert => "/usr/share/logstash/config/certs/ca-cert.pem"
-    ssl_certificate_verification => false
   }
 }
 -----
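For orientation, the two hunks above can be read together as one pipeline. The following is a hedged sketch only, not text from the docs themselves: it combines the filter and output snippets shown in this commit as they stand after the change (the `ssl_verification_mode` and `ssl_certificate_verification` lines removed), and the `elastic_agent` input with port 5044 is an assumed addition for completeness. Hosts, certificate paths, and credentials are the placeholder values from the diff.

```
input {
  elastic_agent {
    port => 5044              # assumed input; not part of this commit
  }
}

filter {
  elastic_integration {
    hosts => "{es-host}:9200"
    ssl_enabled => true
    ssl_certificate_authorities => ["/usr/share/logstash/config/certs/ca-cert.pem"]
    auth_basic_username => "elastic"
    auth_basic_password => "changeme"
  }
}

output {
  elasticsearch {
    hosts => ["{es-host}:9200"]   # assumed; the output hunk does not show this line
    user => "elastic"
    password => "changeme"
    cacert => "/usr/share/logstash/config/certs/ca-cert.pem"
  }
}
```

With certificate authorities supplied and no verification-mode overrides, both the filter and the output verify the {es} certificate chain by default, which is the apparent intent of removing those two settings.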
@@ -167,19 +162,15 @@ output {
 }
 -----
 
-Every event sent from the Elastic Agent to Logstash contains specific meta-fields. Input event are expected
-to have data_stream.type, data_stream.dataset, and data_stream.namespace . This tells Logstash which pipelines
-to fetch from Elasticsearch to correctly process the event before sending that event to it’s destination output.
-Logstash performs a check quickly and often to see if an integrations associated ingest pipeline has had updates
-or changes so that events are processed with the most recent version of the ingest pipeline.
+Every event sent from the Elastic Agent to Logstash contains specific meta-fields.
+Input events are expected to have data_stream.type, data_stream.dataset, and data_stream.namespace.
+Logstash uses this information and its connection to Elasticsearch to determine which Integrations to apply to the event before sending that event to its destination output.
+Logstash frequently synchronizes with Elasticsearch to ensure it has the most recent versions of the enabled Integrations.
 
 
 All processing occurs in Logstash.
 
 
-The user or credentials specified in the elastic_integration plugin needs to have sufficient privileges to get
-
-the appropriate monitoring, pipeline definitions, and index templates necessary to transform the events. Mini-
-mum required privileges can be found here: https://www.elastic.co/guide/en/logstash/current/plugins-filters-
-elastic_integration.html#plugins-filters-elastic_integration-minimum_required_privileges
+The user or credentials specified in the elastic_integration plugin need to have sufficient privileges to get information about Elasticsearch and the Integrations that are enabled.
+Minimum required privileges can be found here: https://www.elastic.co/guide/en/logstash/current/plugins-filters-elastic_integration.html#plugins-filters-elastic_integration-minimum_required_privileges.
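The meta-field contract described in the hunk above can be illustrated with a small filter fragment. This is a hypothetical sketch, not part of the commit: the field references use standard Logstash `[data_stream][...]` syntax, but the literal values compared against (`logs`, `crowdstrike.falcon`) are assumed examples.

```
filter {
  # Events shipped by Elastic Agent carry data_stream.* meta-fields;
  # the values below are assumed examples for illustration only.
  if [data_stream][type] == "logs" and [data_stream][dataset] == "crowdstrike.falcon" {
    mutate { add_tag => ["crowdstrike"] }
  }
}
```

In practice you rarely need such conditionals for the integration itself, since the elastic_integration filter reads these fields on its own; the fragment only shows which fields the text is referring to.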

docs/static/ea-integrations.asciidoc

Lines changed: 1 addition & 1 deletion
@@ -78,7 +78,7 @@ output { <3>
 -----
 
 <1> Use `filter-elastic_integration` as the first filter in your pipeline
-<2> You can use additional filters as long as they follow `filter-elastic_integration`
+<2> You can use additional filters as long as they follow `filter-elastic_integration`. They will have access to the event as-transformed by your enabled integrations.
 <3> Sample config to output data to multiple destinations
 
 
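Callout <2> above notes that later filters see the event as already transformed by the enabled integrations. A minimal sketch of that ordering follows; the `mutate` filter and the field it adds are illustrative assumptions, not content from either file in this commit.

```
filter {
  elastic_integration {
    # connection settings as shown earlier in this commit
    hosts => "{es-host}:9200"
    ssl_enabled => true
    auth_basic_username => "elastic"
    auth_basic_password => "changeme"
  }
  # Runs after the integration pipelines, so it sees ECS-normalized fields.
  mutate {
    add_field => { "[labels][pipeline]" => "logstash" }   # illustrative only
  }
}
```

Placing `elastic_integration` first keeps the ordering guarantee from callout <1>: any filter added after it operates on the normalized event rather than the raw Agent payload.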