Store the resulting documents in Elasticsearch

We use the Elasticsearch output plugin that ships with Logstash to send data to Elasticsearch. The usage is simple; we just need an elasticsearch block under the output section, as follows:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "sensor_data-%{+YYYY.MM.dd}"
  }
}

We have specified hosts and index so that the data is sent to the right index within the right cluster. Notice that the index name contains %{+YYYY.MM.dd}. This sprintf date reference computes the index name from the event's @timestamp field, formatting the date as year.month.day.

Remember that we had defined an index template with the index pattern sensor_data*. When the first event is sent on May 26, 2019, the output plugin defined here will send the event to the index sensor_data-2019.05.26, creating it if it doesn't exist yet.
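The date pattern also controls how often a new index is created. As an illustrative sketch (not part of the original pipeline), changing the pattern to %{+YYYY.MM} would roll over monthly instead of daily:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # Monthly granularity: events from May 2019 go to sensor_data-2019.05
    index => "sensor_data-%{+YYYY.MM}"
  }
}

Since the index template pattern sensor_data* matches this name as well, the same mappings would still be applied.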

If you want to send events to a secured Elasticsearch cluster as we did when we used X-Pack in Chapter 8, Elastic X-Pack, you can configure the user and password parameters as follows:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "sensor_data-%{+YYYY.MM.dd}"
    user => "elastic"
    password => "elastic"
  }
}
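Storing the password in plain text inside the pipeline file is best avoided. Logstash can resolve ${VARIABLE} references in the configuration from environment variables or from the Logstash keystore; the following sketch assumes an entry named ES_PWD has already been created (the name is our choice, not a built-in):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "sensor_data-%{+YYYY.MM.dd}"
    user => "elastic"
    # Resolved at startup from an environment variable or the Logstash keystore
    password => "${ES_PWD}"
  }
}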

This way, we will have one index for every day, with each day's data stored in its own index. We covered the index-per-time-frame strategy in Chapter 9, Running the Elastic Stack in Production; among other benefits, all daily indices can be searched together with the wildcard pattern sensor_data-*, and old data can be removed simply by deleting the indices for older days.

Now that we have our Logstash data pipeline ready, let's send some data.
