Linux Format

Tips and best practice


I hope that reading the last couple of issues has persuaded you to give ELK a try and get a grip on your log processing. To finish, I thought I'd best list some hints and tips I've come across while working with and researching ELK.

In terms of hardware (virtual or not) for the Elasticsearch layer, the recommendation is to have 4-8 cores per node for a reasonably sized cluster, with 2-4GB of RAM. (As with anything, there is a certain element of finger-in-the-air here, depending on how much data is going to be processed.)

I haven't covered it at all here due to lack of space, but securing/encrypting traffic between Filebeat and Logstash might be a requirement for your environment (and is generally a good idea), as is stopping the various layers from listening on all interfaces (look at the config documentation for details on this).

At a general level, persuading developers to have their applications use structured logging (for example JSON) will save you a lot of pain!

For a new installation, starting small in terms of configuration and gradually building out is probably the best advice I could give. Split the Logstash config into separate files for better management of this aspect of the stack, and test each change that you make in a development environment. Use a dump file of static data (see the Dumping data into the system box) and measure how long it takes to process before and after changes you make to filter and/or Grok config. (Use the former rather than the latter where you can for speed!)
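To illustrate why structured logging pays off, here's a minimal sketch of a Logstash filter block. It's not from any particular setup: the leading-brace test and the Apache pattern are just illustrative choices, but the `json` and `grok` filter plugins and their options shown here are standard Logstash.

```
filter {
  # If the app already emits JSON, one cheap filter parses the
  # whole event into fields - no pattern maintenance needed.
  if [message] =~ /^\{/ {
    json {
      source => "message"
    }
  } else {
    # Fallback for unstructured lines: grok is slower and the
    # patterns are brittle, which is the pain referred to above.
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  }
}
```

The conditional means structured and legacy log sources can share one pipeline while you migrate applications over.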
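Splitting the Logstash config might look like the hypothetical layout below. When pointed at a directory, Logstash concatenates the files in lexical order, so numbered prefixes keep inputs, filters and outputs in a predictable sequence (the file names here are made up; the conf.d path is a common packaging convention, not a requirement).

```
/etc/logstash/conf.d/
├── 10-beats-input.conf     # input { beats { port => 5044 } }
├── 30-app-filters.conf     # filter blocks, one file per concern
└── 90-elastic-output.conf  # output { elasticsearch { ... } }
```

For the before/after timing test, something like `time bin/logstash -f /etc/logstash/conf.d/ < dump.log` (with a `stdin` input and, say, a `null` output standing in for Elasticsearch) gives a rough, repeatable benchmark against the static dump file.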
