Analyzing & monitoring application logs using ELK stack

One of the major issues faced by modern-day enterprise applications is the management and monitoring of application logs. During development, logging is often overlooked as a rather unimportant part. But once the application is live in production, we realize that the first level of insight into what is happening in the system comes from the generated logs. With modern-day architectures, the components are independent and each has its own logging. It becomes very cumbersome to check and analyze all of these when the system is very large. This is where the ‘ELK’ stack comes to the rescue.

What is ELK?

ELK is a software stack that helps us combine the logs from different systems and then analyze, monitor and evaluate them in a single dashboard. ‘ELK’ is an acronym formed from the first letters of its components: Elasticsearch, Logstash and Kibana.

The functions of each component can be summarized in the following way:

  • Elasticsearch : The core component that stores the logs as schema-free JSON documents which can be queried using HTTP calls. This provides very fast real-time query capabilities which are essential for analyzing a huge amount of logs.
  • Logstash : Logstash acts as the lumberjack. It reads the logs from the log files, applies a regex (grok pattern) to extract the fields and then stores them in Elasticsearch.
  • Kibana : This is the UI component that provides the capability to create dashboards and query the data stored in Elasticsearch.

We are now going to see how these can be set up to load the logs from a Java-based application that generates logs using the Logback library.

My Application Setup

A Spring-based Java application

  • 2 different components deployed on 4 servers (load balanced).
  • Logback used for logging.
  • Logs are rotated at 80 MB and moved to a folder named after the current date.
  • Around 3.5 GB worth of application logs generated each day.
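For reference, the rotation described above can be expressed in logback.xml roughly as follows. This is a sketch assuming Logback's SizeAndTimeBasedFNATP triggering policy; the appender name, paths and log pattern are illustrative, not taken from my actual application:

```xml
<!-- Sketch: roll at 80 MB, placing rolled files in a folder named after the current date -->
<appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
  <file>logs/app.log</file>
  <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
    <!-- %d{yyyy-MM-dd} becomes the dated folder, %i the rotation index within the day -->
    <fileNamePattern>logs/%d{yyyy-MM-dd}/app.%i.log</fileNamePattern>
    <timeBasedFileNamingAndTriggeringPolicy
        class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
      <maxFileSize>80MB</maxFileSize>
    </timeBasedFileNamingAndTriggeringPolicy>
  </rollingPolicy>
  <encoder>
    <pattern>%d [%thread] %-5level %logger - %msg%n</pattern>
  </encoder>
</appender>
```

The encoder pattern here is what the grok pattern later in this post has to mirror, so it helps to keep it simple and consistent across components.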

For most of the tutorial below, I am going to be talking about setting up the ELK stack for a Java-based application, but the setup part remains the same for almost any other application; you may just need to tweak the part where the application logs are parsed.


ELK Setup

So let's start with building the ELK stack for your applications. I am using a Windows-based server, and hence the setup is going to be for a Windows environment.

System requirements

The ELK stack can be distributed across multiple systems, and Elasticsearch can operate in a clustered mode. But for my requirement, a single system was enough with the following specifications:

  • Windows server 2012 R2
  • 32-core Xeon processor (2.4 GHz)
  • 16 GB RAM
  • 1 TB hard disk space
  • Java 8 installed with JAVA_HOME set up properly.

You may need more or less based on the amount of logs generated and the retention period you are planning for them.

Downloading the components

Now, let's get the ELK components downloaded and configured. We can go to each component's website and download the zip file; installation is as simple as extracting the files. Please download the component zip files from the following links

Elasticsearch setup

Let's start with the Elasticsearch installation and setup, as it is required by the other components to start properly.

  1. Create a folder elkstack on your D drive (or any drive that has the most free space, as Elasticsearch will be storing its indices/data on that drive and will consume the most space).
  2. Extract the elasticsearch zip file to the above folder.
  3. Open a command window and cd to elkstack\elasticsearch\bin
  4. Execute the following in the command prompt:
    service install
    This will install Elasticsearch as a service.
  5. Once the service is installed, we need to run it and set it to start on boot.
  6. Execute the following command in the same command prompt
    service manager
    This will bring up a dialog window where you can set ‘Startup Type’ to Automatic and then click the ‘Start’ button. Alternatively, you can start and stop the service by running service start or service stop from the same command window.
  7. Go to a web browser, type in the Elasticsearch address (by default http://localhost:9200), and you should see a JSON output on the screen, which indicates that the Elasticsearch service is running fine.
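If you prefer the command line, the same check can be done with curl against a running instance. This assumes the default Elasticsearch HTTP port 9200 and that curl is available on the server:

```shell
REM Basic check: returns a JSON document with the node name and version
curl http://localhost:9200

REM Cluster health: look for "status" of green or yellow
curl "http://localhost:9200/_cluster/health?pretty"
```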

Logstash setup

Once Elasticsearch is set up, we can move to the next component, Logstash. This is where we need to make some configuration changes. As this is the component that will be reading the logs and parsing them to extract the fields, we need to make sure that the settings are right for parsing our generated log files.

  1. Extract the logstash zip file to the elkstack folder
  2. Download the custom configuration file for the logstash here
  3. Download the custom pattern we are using from here
  4. cd to the elkstack\logstash\bin folder and replace the logstash.conf with the one downloaded from the above link.
  5. Put the custompatterns file in the elkstack\logstash\patterns folder
  6. We need to edit the logstash.conf file to match your configuration


Editing logstash configuration

We need to edit the Logstash configuration to match the location and pattern of your log files. So start by editing the logstash.conf file, and you should see something like the below.


Editing the path to log files

The path field is the path where Logstash will be looking for files. If you have multiple components generating logs, I suggest you put them in respective folders and put these folders under a master folder (e.g. APP01, APP02, INTEGRATION01, INTEGRATION02, etc. under an ApacheTomcat folder). This can then be used to segregate the logs in Kibana using a query.

The /** means that the directories under ApacheTomcat will be traversed recursively.
The *.log means that we are looking for files with the .log extension in those folders.
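Putting those two pieces together, the input section of logstash.conf might look roughly like this. The drive and folder names are illustrative; adjust the path to your own layout (the downloaded logstash.conf is the authoritative version):

```
input {
  file {
    # Watch every .log file under the per-component folders, recursively.
    # Note: the file input expects forward slashes even on Windows.
    path => "D:/elkstack/logs/ApacheTomcat/**/*.log"
    # Read existing files from the start on first run, then tail them
    start_position => "beginning"
  }
}
```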

The pattern for parsing the logs:

This was the only hard part I faced when trying to set up the ELK stack: getting the pattern correct. I have already created a pattern for Logback-based log files, and you can use it readily.

Basically the pattern is called a grok pattern, and we are extracting the parts of the logs (timestamp, thread, log level, class and the actual message). You can give names to the extracted fields, and they will be stored under those names in Elasticsearch and be available in Kibana for filtering.
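As a sketch, a grok filter for such a Logback layout could look like the following. The field names (thread, class, logmessage) are illustrative, and the match syntax shown is the array form used by Logstash 1.x; the downloaded logstash.conf and custompatterns file are the authoritative versions:

```
filter {
  grok {
    # custompatterns lives in elkstack\logstash\patterns
    patterns_dir => ["./patterns"]
    # timestamp [thread] LEVEL fully.qualified.Class - message
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} \[%{DATA:thread}\] %{LOGLEVEL:level} %{JAVACLASS:class} - %{GREEDYDATA:logmessage}" ]
  }
}
```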

You can use the following sites to test your grok pattern. You just need to input the content from the log file and the pattern below. If you are testing with the current pattern from the file, please add the contents of the custompatterns file to the custom-pattern section of the testing tool.
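If you want to sanity-check the field extraction offline, remember that a grok pattern ultimately compiles down to a regular expression. Here is a rough Python equivalent for a layout like %d [%thread] %-5level %logger - %msg; the regex and sample line are illustrative, not the exact downloaded pattern:

```python
import re

# Named groups play the role of grok field names (timestamp, thread, level, ...)
LOG_RE = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}[.,]\d{3})\s+"
    r"\[(?P<thread>[^\]]+)\]\s+"
    r"(?P<level>TRACE|DEBUG|INFO|WARN|ERROR)\s+"
    r"(?P<logger>[\w.$]+)\s+-\s+"
    r"(?P<message>.*)"
)

# A sample Logback line matching the layout above
line = "2016-05-30 10:15:42,123 [main] INFO  com.example.OrderService - Order 42 placed"
fields = LOG_RE.match(line).groupdict()
print(fields)
```

If the match comes back empty or None for your real log lines, the grok pattern built from the same layout will fail in the same way, so this is a quick way to debug before restarting Logstash.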

A complete explanation of grok is beyond the scope of this post. Kindly search Google for more information on grok.

Starting logstash

Once the above setups are done, we can start the logstash.

  1. Open a new command window
  2. cd to elkstack\logstash\bin
  3. Run the following command.
    logstash.bat agent -f logstash.conf


Setting up Kibana

Kibana is the UI for the logs in Elasticsearch and provides capabilities for creating dashboards and adding panels for different visualizations of the data and counts. It also has the capability to write queries based on the fields extracted from the logs by Logstash.
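For example, once Logstash has extracted fields such as level and stored the source file path, a query in Kibana's search bar can combine them. The field names here are illustrative and depend on your grok pattern and folder layout:

```
level:ERROR AND path:*APP01*
```

This is how the per-component folder structure suggested earlier pays off: the path field lets you segregate logs by component without any extra configuration.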

  1. Extract the zip file to elkstack\kibana
  2. Download the default dashboard configuration from here
  3. Put the file default.json in elkstack\kibana\app\dashboards  replacing the current file in the folder
  4. In case you are giving a different filename, you need to edit the file elkstack\kibana\config.js and replace the following line:
    default_route : ‘/dashboard/file/default.json’
    with, for example:
    default_route : ‘/dashboard/file/nlp.json’
  5. Now we need to add Kibana as a website in the IIS server.
    1. Open the IIS manager
    2. Create a new website with hostname as myelk
    3. Set the physical path to elkstack/kibana
    4. Start the website
  6. Go to http://myelk and you should be able to see the default dashboard loading.


You can configure Kibana in different ways to have it search for a particular log pattern and then visualize it using different widgets. Below is an example of a dashboard I have created.



I will be doing a separate post on how to use Kibana for creating dashboards like the one above. Thank you for your time, and please comment below if you have any queries on the setup.

S 🙂

