Building Data Pipelines with Logstash

In the previous chapter, we understood the importance of Logstash in the log analysis process. We also covered its usage and high-level architecture, and went through some commonly used plugins. One of the important functions of Logstash is converting unstructured log data into structured data, which makes it easy to search for relevant information and also assists in analysis. Apart from parsing log data to make it structured, it is also helpful to enrich the log data during this process so that we can gain further insights into our logs. Logstash comes in handy for enriching our log data, too. We also saw in the previous chapter that Logstash can read from a wide range of inputs, but that it is a heavy process; installing Logstash on the edge nodes that ship logs might not always be feasible. Is there an alternative, lightweight agent that can be used to ship logs? Let's explore that in this chapter as well.
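
As a quick taste of what's ahead, here is a minimal pipeline sketch that parses Apache access logs with the grok filter and enriches each event with the geoip filter. The log file path and the Elasticsearch host are assumptions made purely for illustration:

    input {
      file {
        # Hypothetical path to an Apache access log
        path => "/var/log/apache2/access.log"
        start_position => "beginning"
      }
    }

    filter {
      # Parse each unstructured line into named fields
      grok {
        match => { "message" => "%{COMBINEDAPACHELOG}" }
      }
      # Enrich the event with geographical data for the client IP
      geoip {
        source => "clientip"
      }
    }

    output {
      # Assumes a local Elasticsearch instance
      elasticsearch {
        hosts => ["localhost:9200"]
      }
    }

We will unpack how the grok and other filter plugins work, and how to build such pipelines step by step, in the sections that follow.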

In this chapter, we will be covering the following topics:

  • Parsing and enriching logs using Logstash
  • The Elastic Beats platform
  • Installing and configuring Filebeat for shipping logs