Thursday, 26 May 2016

Log Management for Play applications with Logstash, Elasticsearch and Kibana

There are numerous posts online that explain how to set up Elasticsearch, Logstash and Kibana (ELK) for log analysis, but most of them are confusing or overkill, and I had a tough time corroborating them all. So, to save everybody's time, I am writing this up in the simplest manner I can.

Well, ELK is not a new concept, so I won't get into the intricacies of the three systems; rather, I will help you set the stack up from the ground up.

I am using a Play application to push data into Logstash directly over TCP. Elasticsearch is used for searching and Kibana for visualisation.

STEP 1:

We need to add the Logstash Logback encoder to build.sbt.

libraryDependencies += "net.logstash.logback" % "logstash-logback-encoder" % "4.5.1"

I am working with Play 2.5.0, and this dependency version worked for me. I tried the latest version, but it threw errors, so I stuck with 4.5.1.

This dependency is what lets the application push its logs to the Logstash node.

STEP 2:

Add the Logstash appender configuration to your logback.xml, and then reference that appender from your log level.

In this example my Logstash instance is running locally, listening on port 5044 for the logs.
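Since the XML itself matters here, this is a minimal sketch of what that configuration can look like with logstash-logback-encoder 4.5.1 (the appender name and the INFO level are my choices; adjust them to your setup):

```xml
<configuration>

  <!-- Ships each log event as JSON over TCP to the Logstash tcp input -->
  <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <destination>localhost:5044</destination>
    <encoder class="net.logstash.logback.encoder.LogstashEncoder" />
  </appender>

  <root level="INFO">
    <appender-ref ref="LOGSTASH" />
  </root>

</configuration>
```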

STEP 3:

Installing Logstash

(i)   Download the latest version of Logstash from https://www.elastic.co/downloads/logstash. I have used logstash-2.3.1
(ii)  Unzip the folder
(iii) Create a logstash.conf file and place it in the Logstash folder. Add the following contents to this conf file.

input {
  tcp {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

filter {
  json {
    source => "message"
  }
  ruby {
    code => "
      event.to_hash.keys.each { |k| event[ k.sub('.','_') ] = event.remove(k) if k.include?'.' }
    "
  }
}

The first part is the input configuration for Logstash. It accepts incoming logs over TCP, listening on port 5044.
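If you want to sanity-check this tcp input without wiring up the Play application, you can push a JSON line at it yourself. Here is a minimal Ruby sketch (the event fields are made up; the host and port match the config above):

```ruby
require 'socket'
require 'json'

# Serialize an event hash as a single JSON line, the shape the json filter expects
def encode_event(event)
  JSON.generate(event) + "\n"
end

# Send one encoded event to the Logstash tcp input
def send_log(event, host = 'localhost', port = 5044)
  TCPSocket.open(host, port) do |sock|
    sock.write(encode_event(event))
  end
end

# With Logstash running locally:
# send_log('message' => 'hello from ruby', 'level' => 'INFO')
```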

The second part is the output configuration. It forwards the collected logs to Elasticsearch, running locally on port 9200.

The third part is the filter section, which parses each incoming log line as JSON. The ruby filter then replaces the first dot in each field name with an underscore, because Elasticsearch 2.x does not allow dots in field names and throws a parsing exception when it encounters them.
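To see what that ruby code does, here is the same key rewrite on a plain Ruby hash (the real filter works on a Logstash event rather than a Hash; note also that `sub` only replaces the first dot in a key, so a made-up field like `source.host.name` keeps its second dot — use `gsub` if you want them all replaced):

```ruby
# A made-up event, as it might look after the json filter has parsed it
event = {
  'logger.name'      => 'application',
  'source.host.name' => 'localhost'
}

# Rewrite every key containing a dot, replacing only the first dot with an underscore
event.keys.each do |k|
  event[k.sub('.', '_')] = event.delete(k) if k.include?('.')
end

event
# => { "logger_name" => "application", "source_host.name" => "localhost" }
```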

(iv) Finally, start Logstash with the command below

sh bin/logstash -f logstash.conf


STEP 4:

Installing Elasticsearch

(i) Download Elasticsearch from https://www.elastic.co/downloads/elasticsearch
(ii) Unzip the file
(iii) Open the config/elasticsearch.yml file. Look for network.host, uncomment it and replace the default value with localhost. Also uncomment http.port and set its value to 9200.

# ---------------------------------- Cluster -----------------------------------
#
# Use a descriptive name for your cluster:
#
cluster.name: my-application

# ------------------------------------ Node ------------------------------------
#
# Use a descriptive name for the node:
#
node.name: node-1

# ---------------------------------- Network -----------------------------------
#
# Set the bind address to a specific IP (IPv4 or IPv6):
#
network.host: localhost
#
# Set a custom port for HTTP:
#
http.port: 9200

(iv) Start Elasticsearch using the following command.


sh bin/elasticsearch

(v) After this, you can check whether the setup is correct by hitting http://localhost:9200/

You should see something like the following


{
  "name" : "Colleen Wing",
  "cluster_name" : "elasticsearch",
  "version" : {
    "number" : "2.3.3",
    "build_hash" : "218bdf10790eef486ff2c41a3df5cfa32dadcfde",
    "build_timestamp" : "2016-05-17T15:40:04Z",
    "build_snapshot" : false,
    "lucene_version" : "5.5.0"
  },
  "tagline" : "You Know, for Search"
}


STEP 5:

Installing Kibana

(i) Download the latest version from https://www.elastic.co/downloads/kibana
(ii) Unzip the downloaded file.
(iii) Open config/kibana.yml. Look for the following keys, uncomment them and set the values as shown


# Kibana is served by a back end server. This controls which port to use.
server.port: 5601

# The host to bind the server to.
server.host: localhost

# If you are running kibana behind a proxy, and want to mount it at a path,
# specify that path here. The basePath can't end in a slash.
# server.basePath: ""

# The maximum payload size in bytes on incoming server requests.
# server.maxPayloadBytes: 1048576

# The Elasticsearch instance to use for all your queries.
elasticsearch.url: "http://localhost:9200"


So Kibana runs on localhost, port 5601, and will use the Elasticsearch instance running on port 9200.

(iv) Start Kibana using the following command


sh bin/kibana


If everything is set up correctly, you can hit http://localhost:5601/ and see the logs coming in.