Experiment with the Kibana 4 beta

Posted on 2014-12-01 by Jettro Coenradie

A few weeks ago, the guys from elasticsearch made the public beta of Kibana 4 available. In this blog post I am going to generate some data and then play around with this data and the Kibana beta. If you have never heard about these tools, read the introduction; if you already know them, you can safely skip it.

Introduction

Kibana rocked the world of graphical user interfaces for logs. Together with Logstash and elasticsearch it has never been so easy to visualise the logs of applications. I think it contributed a lot to the popularity of elasticsearch. There were some serious issues with Kibana though. For me, security and the lack of support for aggregations were the biggest ones.

Kibana was installed as a website, or, against best practices, as a plugin in elasticsearch. Especially on your local machine it was very easy to deploy it as a site plugin. With version 4 this changed. Now Kibana comes as a separate product that you can start on its own. It still connects to the rest endpoint of elasticsearch, but your browser no longer needs access to the elasticsearch host.

If you are into Kibana 4, also have a look at my most recent post: Finding your blog abusers using Kibana 4 and logstash 1.5

You can find more information about downloading and installing Kibana on the elasticsearch website. We will continue with the installation and usage of a log generation tool after we have started Kibana.

http://www.elasticsearch.org/overview/kibana/installation/

If you have not started elasticsearch and Kibana yet, this is a good time. Installing Kibana is just extracting the zip or tar, stepping into the bin folder and running the kibana shell script or bat file. The output is shown below.

The Kibana Backend is starting up... be patient
{"@timestamp":"2014-11-27T17:41:44+01:00","level":"INFO","name":"Kibana","message":"Kibana server started on tcp://0.0.0.0:5601 in production mode."}

Generating logs

There is a very nice nodejs tool that can generate logs and insert them into elasticsearch as if they were ingested using logstash. Below you can find the website to install the tool.

https://www.npmjs.org/package/makelogs

Installation is easy using npm; if you install the tool globally, interacting with it is also very easy.

npm install -g makelogs

Now we can generate the logs with the following command. In the example we are ingesting 10 million records, spread over the past 5 days and today.

makelogs --count=10m --days=-5,0


First look at Kibana

Using your favourite browser, go to the url http://localhost:5601. The first thing to do is to register at least one index pattern. Since we are simulating logstash data, we configure logstash-* as the pattern and the @timestamp field as the required Time-field name for time based events. After pushing the create button we are redirected to a page showing all the fields and some information about them. Some things to notice in the mapping are that we have fields of type geo_point, a number of not-analysed fields, some strings, numbers and an ip address. Interesting fields to play around with later on.

Figure – 1: Shows the creation of an index pattern
Create index pattern kibana
Figure – 2: Shows the fields of the index with their types and some other properties.
Created index fields kibana
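If you want to verify these field types outside of Kibana, you can ask elasticsearch for the mapping directly. A quick check, assuming elasticsearch runs on the default port 9200:

```shell
# Ask elasticsearch for the mapping of all logstash indices; the types
# (geo_point, string, long, ip) match what Kibana shows on this page.
curl -XGET "http://localhost:9200/logstash-*/_mapping?pretty"
```
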

If you want you can keep playing around in the settings, but I prefer moving to the Discover tab at the top left to see the data that we have. We are welcomed by a screen with a bar chart showing the number of documents. On the left we see all the fields. I clicked on the _type field and immediately got an impression of the data in that field. In my case I can see I have two types: apache and nginx. If I pushed the Visualise button I would go to the Visualise tab with a chart representing this data, but that is not what I want to do right now. Below you see the image of the screen after clicking the _type field.

Figure – 3: Shows what the screen looks like after clicking one of the fields.
Clicking the type field

The first step is to create a query that only returns documents from apache or from nginx. In the next section we are going to create these queries and store them for later use.

Creating queries

With Kibana 4 we can create queries and store them for later use in the Visualise part. In the top search bar, type the following query and execute it with the search button. Look closely at the screen: what happened to the _type field values on the left?

_type:apache
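As an aside, the search bar accepts the Lucene query string syntax. The same filter can also be sent to elasticsearch directly as a query_string query; a sketch, assuming elasticsearch on the default port 9200 and the logstash-* indices from above:

```shell
# The same filter the Kibana search bar applies, expressed as a raw
# search request; size=1 returns just one matching document.
curl -XGET "http://localhost:9200/logstash-*/_search?pretty" -d '{
  "query": { "query_string": { "query": "_type:apache" } },
  "size": 1
}'
```
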

To the right of the search bar you can find a number of buttons, one of them being the save button with that strange image that nobody uses anymore :-). If you push it, you are asked for a name and you can save the query. To the left of the save button there is a new search button. We push it to create the search for all nginx documents.

Figure – 4: Shows the pull down you get when clicking the save query button.
Save query in kibana

The result is that we now have two named and saved queries. You can check if it worked by using the third button to the right of the search bar, the one that enables you to load a query. Next we are going to use these saved queries in a visualisation.

Create Visualisation

When creating a new visualisation we get some sort of wizard to create it. The first step is to create a query or to use a saved query. We use a saved query, the apache type documents. The next step is to choose a visualisation type. I want to do something with http status codes, so I choose a Pie chart. Then we are greeted with one big green pie. Not really what we want, so we need to add an aggregation. Push the add aggregation button and see what happens.

Figure – 5: Shows the screen before choosing a bucket type.
Choose bucket type

If the amount of choices is already puzzling you, don't be scared. You cannot really break anything; if you choose wrong, just push the create new visualisation button at the top right and start over. I take the Split Slices button for now. Then I do a terms aggregation and I have to choose a field. Do you see that the fields are grouped by their type? Cool stuff, or not? I choose the response field, leave the default of top 5 and get a nice chart. Since I only have three values and most of them are 200, I want something else. Therefore I change the field to agent.raw and push the apply button. That is better. The next diagram shows the result. Now use the save button again to store the visualisation: Agents from apache.

Figure – 6: Shows all different agents for apache requests.
Agents from apache
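Under the hood this pie chart is driven by a terms aggregation. A minimal version of the request, assuming the logstash-* indices and the default elasticsearch port:

```shell
# Top 5 agent.raw values for apache documents; size=0 skips the hits
# so only the aggregation buckets come back.
curl -XGET "http://localhost:9200/logstash-*/_search?pretty" -d '{
  "query": { "query_string": { "query": "_type:apache" } },
  "size": 0,
  "aggs": {
    "agents": { "terms": { "field": "agent.raw", "size": 5 } }
  }
}'
```

The not-analysed agent.raw field is used instead of agent, so each slice is a complete agent string rather than a single analysed token.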

Now we can improve this even more. Create a new visualisation and this time use the option from an existing visualisation. Choose the one we just created. Now we are going to add a sub aggregation. This time again split slices, again a terms aggregation, and the field machine.os. Then look at the really nice result. Store it as: Agents by apache by machine os.

Assignment: spot the improvement for the mapping of the documents.

Figure – 7: Shows the operating systems next to the agent.
Agents machine os by apache
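The sub aggregation that splits each agent slice by operating system maps to a nested terms aggregation; again a sketch against the makelogs data, assuming the default elasticsearch port:

```shell
# For every top agent, count the documents per machine.os value.
curl -XGET "http://localhost:9200/logstash-*/_search?pretty" -d '{
  "query": { "query_string": { "query": "_type:apache" } },
  "size": 0,
  "aggs": {
    "agents": {
      "terms": { "field": "agent.raw", "size": 5 },
      "aggs": {
        "os": { "terms": { "field": "machine.os", "size": 5 } }
      }
    }
  }
}'
```
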

Next up we are going to use the saved visualisations to create a dashboard.

Creating the dashboard

Go to the Dashboard tab and use the mentioned button to add saved visualisations to the dashboard. Choose the two saved visualisations. Now do some dragging and resizing. Try to make the visualisations the same size and have each span half of the screen. Notice the help you get when resizing: grey boxes show some default sizes. Cool stuff. Next I store my dashboard as the apache dashboard.

Figure – 8: Shows the created dashboard with two visualisations
Apache dashboard

This is the first full cycle of creating a dashboard from visualisations using stored queries. Next up is doing some more advanced visualisations and settings.

Configuration and settings

Changing the time window

In the top right corner you can set the time filter. The default is the last 15 minutes. If you click the clock, you get the following window where you can change the period to one of the default periods. Very easy to do.

Figure – 9: Shows How to change the time interval that is used for the report
Config time period

Showing Request/Response

Kibana has a very nice way of investigating the query that was executed to create a chart, as well as the response it produced. At the very bottom there is a very small icon that should give you the idea that you can click it to pull up a window. What a nice surprise when you do: you get the option to see the data in a table, the request, the response and finally some statistics about the query. Really nice. The following image shows the screen.

Figure – 10: Shows that you can look at the request and the response from the elasticsearch server
Pull up for request response

Other visualisations

Creating a map

At the moment the map support is based on geohashes. Since it is so easy to configure I am not going through all the steps, just showing an image with the map.

Figure – 11: Shows an example of adding a map to the dashboard.
Showing the map
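The map is backed by a geohash_grid aggregation on the geo_point field. A minimal request, assuming makelogs stores the coordinates in a geo.coordinates field (the field name is my assumption; check your own mapping):

```shell
# Bucket documents into geohash cells; a higher precision means
# smaller cells and therefore more buckets.
curl -XGET "http://localhost:9200/logstash-*/_search?pretty" -d '{
  "size": 0,
  "aggs": {
    "cells": { "geohash_grid": { "field": "geo.coordinates", "precision": 3 } }
  }
}'
```
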

A nice area chart

The next chart is an area chart with a date histogram. As a sub aggregation I used the agent field and configured it to be of type area split.

Figure – 12: Example of a splitter chart.
Example area split
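Expressed as a raw request, this area chart combines a date_histogram with a terms sub aggregation; the interval and field names below are my assumptions:

```shell
# One bucket per hour, and within each hour the top agents; Kibana
# draws one area per agent over these time buckets.
curl -XGET "http://localhost:9200/logstash-*/_search?pretty" -d '{
  "size": 0,
  "aggs": {
    "over_time": {
      "date_histogram": { "field": "@timestamp", "interval": "1h" },
      "aggs": {
        "agents": { "terms": { "field": "agent.raw", "size": 5 } }
      }
    }
  }
}'
```
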

Finalising thoughts

So far the thing I am missing most is that I cannot click in a diagram to add a filter, but this is on the roadmap. Another thing is that it is still pretty easy to bring elasticsearch to its knees. Knowing that not all users are going to be elasticsearch experts, it would be nice if there were some kind of safeguard in there. Maybe that will come as well. Don't forget it is still just a beta. The final remark I want to make is about security. At the moment, when running on localhost, Kibana is just a proxy for elasticsearch. So you can still execute all queries, you just have to change the url. The first curl request below will create a new index and the second one will delete all indices matching the pattern.

curl -XPUT "http://localhost:5601/elasticsearch/jettro-kibana"
curl -XDELETE "http://localhost:5601/elasticsearch/jet*"

Other than that, it already has a lot more potential than Kibana 3, which was already very nice. Keep up the good work, guys.

References

Blog that communicated the availability of the new beta for Kibana

http://www.elasticsearch.org/blog/kibana-4-beta-2-get-now/

About Jettro Coenradie

I am a Software Developer / Architect with a lot of hands-on experience in Java, AngularJS, Elasticsearch and lots of other tools. I like to use these technologies to help customers with their business challenges. On top of that I like to gather and share knowledge related to data analytics. I have experience with importing and transforming data as well as presenting and visualising it. Currently I am working with tools like elasticsearch, logstash and Kibana, but also D3 and C3 for graphics and other presentations.


7 Comments

  • kartik

Hi, can we trigger emails on a daily basis from Kibana 4? Can you help me with that?

    • Jettro Coenradie

There is no email functionality in Kibana. There is an option in the new product called Watcher, which is now in beta.

    • Vidit Maniyar

      Though I love almost all the products that elastic.co has to offer, I don’t personally vouch for watcher. My ES crashed when I installed watcher. I am not sure I know the reason but I guess it had something to do with ES version. Having said that, I would strongly recommend you look at ElastAlert by Yelp. It has many customizable functionalities and awesome documentation. Good Luck!

  • How is the ELK stack licensed? Do you just pay support for Elasticsearch, Logstash and Kibana and then the operating costs are on you? I hate the way Splunk charges by the GB processed. How much does ELK end up costing in practice?
