Logstash

Logstash needs to be installed on the server from which the logs are to be collected; it then ships them across to Elasticsearch, where indexes are created.

Once you have installed Logstash, it is recommended that you configure rotation for Logstash's own log files (that is, /var/log/logstash/*.stdout, *.err, or *.log). This is handled by logrotate rather than by logstash.conf itself; the rules typically live in a file such as /etc/logrotate.d/logstash. You can also define a suffix format for the rotated files, such as a date format. The following code block is a template for your reference:

    # see "man logrotate" for details 
  
    # rotate logstash logs 
    /var/log/logstash/*.stdout 
    /var/log/logstash/*.err 
    /var/log/logstash/*.log { 
        # number of backlogs to keep 
        rotate 7 
  
        # rotate if bigger than size 
        size 100M 
  
        # use date as a suffix of the rotated file 
        dateext 
  
        # define the suffix format 
        dateformat -%Y%m%d-%s 
  
        # copy and truncate the live file so that Logstash can keep 
        # writing to it; this takes the place of "create" 
        copytruncate 
  
        # compress rotated files; delaycompress waits one cycle 
        compress 
        delaycompress 
  
        # skip missing and empty log files without raising an error 
        missingok 
        notifempty 
    } 
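You can dry-run the rotation rules before relying on them. Assuming the template above has been saved as /etc/logrotate.d/logstash (the path is an assumption; adjust it to wherever you placed the file), logrotate's debug flag prints what would be rotated without touching any files:

 # debug mode: show the rotation plan, change nothing
 $ sudo logrotate -d /etc/logrotate.d/logstash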

In order to ship your logs to Elasticsearch, you require three sections in the configuration, named input, filter, and output, which together define how indexes are created. These sections can either be in a single file or in separate files.

The Logstash event processing pipeline works as an input-filter-output sequence, and each section has its own advantages and usages, some of which are as follows:

  • Inputs: This section is needed to get the data from log files. Some of the common inputs are file, which reads files in a manner similar to tail -f; syslog, which reads from the Syslog service listening on port 514; beats, which collects events from Filebeat; and so on.
  • Filters: These form the middle tier of Logstash; they perform actions on the data based on the defined filters and separate out the data that meets the criteria. Some of them are grok (which structures and parses text based on defined patterns) and clone (which duplicates events, optionally adding or removing fields), and so on.
  • Outputs: This is the final phase, where the filtered data is passed on to one or more defined outputs for further indexing. Some of the commonly used outputs are elasticsearch, a very reliable and convenient platform to save your data that is also much easier to query, and graphite, an open source tool for storing data and displaying it in the form of graphs.
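As a quick illustration of this three-section layout, the following minimal pipeline (separate from the Syslog setup that follows) reads events typed on the console and prints them back with the rubydebug codec; it is a common way to smoke-test a fresh Logstash installation:

    input { 
      stdin { } 
    } 
    filter { 
      # no filtering; events pass through unchanged 
    } 
    output { 
      stdout { codec => rubydebug } 
    } 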

The following are example configuration sections for Syslog:

  • Input section for Syslog is written as follows:
    input { 
      file { 
        type => "syslog" 
        path => [ "/var/log/messages" ] 
      } 
    } 
  • Filter section for Syslog is written like this:
    filter { 
      grok { 
        match => { "message" => "%{COMBINEDAPACHELOG}" } 
      } 
      date { 
        match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ] 
      } 
    } 
  • Output section for Syslog is written as follows:
    output { 
      elasticsearch { 
        protocol => "http" 
        host => "es.appliedcode.in" 
        port => "443" 
        ssl => "true" 
        ssl_certificate_verification => "false" 
        index => "syslog-%{+YYYY.MM.dd}" 
        flush_size => 100 
      } 
    } 
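Note that the options shown above match older (1.x) releases of the Elasticsearch output plugin; in newer Logstash releases, protocol, host, port, and flush_size have been removed or replaced. A roughly equivalent output for a current release (reusing the same example endpoint, es.appliedcode.in) would look like this:

    output { 
      elasticsearch { 
        hosts => ["https://es.appliedcode.in:443"] 
        ssl_certificate_verification => false 
        index => "syslog-%{+YYYY.MM.dd}" 
      } 
    } 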

Configuration files to ship logs are usually stored in /etc/logstash/conf.d/.

If you are making separate files for each section, then there is a convention for naming files that needs to be followed; for example, an input file should be named 10-syslog-input.conf and a filter file should be named 20-syslog-filter.conf. Similarly, for output, it will be 30-syslog-output.conf.
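With that convention, the conf.d directory for the Syslog pipeline would contain something like the following; the numeric prefixes control the order in which Logstash concatenates the files when building the pipeline:

 $ ls /etc/logstash/conf.d/
 10-syslog-input.conf  20-syslog-filter.conf  30-syslog-output.conf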

If you want to validate whether your configuration is correct, you can do so by executing the following command:

 $ sudo service logstash configtest
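The configtest subcommand belongs to older packaged releases of Logstash. On newer versions, the equivalent check is performed by invoking the logstash binary directly with the --config.test_and_exit flag (the binary path shown here assumes a standard package install and may differ on your system):

 # parse the configuration, report errors, and exit without starting
 $ sudo /usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/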

For more information on the Logstash configuration, refer to the documentation examples at https://www.elastic.co/guide/en/logstash/current/config-examples.html.
