Log Monitoring with the ELK Stack

Shubham Singh
4 min read · Apr 12, 2020


Elasticsearch, Logstash, and Kibana

ELK stands for:

  • Elasticsearch: stores and indexes the logs (the search and analytics engine)
  • Logstash: the agent/pipeline that collects logs and sends them into Elasticsearch
  • Kibana: the GUI web interface used to search and visualize the logs

ELK Stack Quickstart Guide

Install Ansible

>> sudo apt install ansible python-pip

Git Clone Ansible-Elasticsearch

>> git clone https://github.com/elastic/ansible-elasticsearch

Install tox:

tox is an automation project we use to run our testing scenarios. It gives us the ability to create a dynamic matrix of many testing scenarios and isolated testing environments, and provides a single entry point to run all tests in an automated and repeatable fashion.

>> apt install tox
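If you later want to run the role's own test suite, a quick optional check (assuming the cloned repository ships a tox.ini, as its docs describe) is to list the test environments tox knows about:

# optional: list the tox test environments defined by the role (run inside the cloned repo)
>> tox -l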

Setup role:

>> cd ansible-elasticsearch
>> mkdir -p roles/elastic.elasticsearch
>> mv defaults/ handlers/ meta/ docs/ filter_plugins/ helpers/ tasks/ templates/ test/ vars/ files/ roles/elastic.elasticsearch

Create an inventory file:

>> vim inventory

""" Add the below line to the inventory file """
localhost ansible_connection=local

Create main.yaml playbook:

>> vim main.yaml

""" Add the following example playbook to the main.yaml file """
- name: Simple Example
  hosts: localhost
  roles:
    - role: elastic.elasticsearch
  vars:
    es_version: 7.6.2

Run playbook:

>> sudo ansible-playbook -i inventory main.yaml

Configure Elasticsearch password for user “elastic” (default):

Use the elasticsearch-keystore tool in /usr/share/elasticsearch/bin to add a "bootstrap.password" entry; Elasticsearch uses it as the initial password for the built-in "elastic" user.
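A minimal sketch of that step (the tool prompts for the password value interactively; the path assumes the Debian/Ubuntu package layout used above):

# add the bootstrap password used as the initial password for the "elastic" user
>> sudo /usr/share/elasticsearch/bin/elasticsearch-keystore add "bootstrap.password"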

Install and Configure Kibana

Now let’s install and configure the Kibana web interface.

Install Kibana

>> apt install kibana

One important thing to note: make sure the version numbers of all the components you use (Elasticsearch, Kibana, and Logstash) match.
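For example, since the playbook above installed Elasticsearch 7.6.2, you can pin Kibana to the same version when installing it. This assumes the Elastic apt repository is already configured on the machine; the exact version string may differ, so check what the repository offers first:

# list the Kibana package versions available from the configured repositories
>> apt-cache madison kibana
# pin Kibana to the same version as Elasticsearch (7.6.2 in this guide)
>> sudo apt install kibana=7.6.2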

Install Nginx

>> apt install nginx

Let’s configure Nginx as a reverse proxy in front of Kibana.

The Kibana service runs on port 5601 by default.

>> vim /etc/nginx/sites-enabled/default

''' Add the following server block to it '''
# Please see /usr/share/doc/nginx-doc/examples/ for more detailed examples.
# Default server configuration
server {
    listen 80;

    # replace the server name with your IP or domain name
    server_name 168.61.168.150;

    location / {
        proxy_pass http://127.0.0.1:5601;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header X-Forwarded-Host $host;
        proxy_set_header X-Forwarded-Port $server_port;
        proxy_cache_bypass $http_upgrade;

        auth_basic "Restricted Content";
        auth_basic_user_file "/etc/nginx/htpasswd.users";
    }
}
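Before moving on, it is worth validating the configuration and reloading Nginx (standard Nginx commands, nothing specific to this setup):

# check the configuration syntax, then reload Nginx if it is OK
>> sudo nginx -t
>> sudo systemctl reload nginx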

You can also refer to the official tutorial to install and set up Kibana.

Setup Nginx Authentication for Kibana

>> sudo sh -c "echo -n 'kibanaadmin:' >> /etc/nginx/htpasswd.users"
>> sudo sh -c "openssl passwd -apr1 >> /etc/nginx/htpasswd.users"
Password: qwertyuiop

# you can check your credentials here
>> cat /etc/nginx/htpasswd.users

Enter your password in the console when prompted and verify it.

Now we have set up our Kibana username as kibanaadmin and password as qwertyuiop, which we’ll use to log in to Kibana.

Configure Elasticsearch connection string in Kibana

>> vim /etc/kibana/kibana.yml

''' Add these lines to the kibana.yml file '''
server.port: 5601
server.host: "0.0.0.0"
elasticsearch.username: "kibana"
elasticsearch.password: "qwertyuiop"
xpack.reporting.enabled: true
xpack.reporting.capture.browser.chromium.disableSandbox: false
logging.verbose: true
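Note that elasticsearch.username and elasticsearch.password are the credentials Kibana uses to talk to Elasticsearch, so the built-in kibana user must actually have that password set on the Elasticsearch side. One way to do this in 7.x (a sketch, assuming X-Pack security is enabled as in this setup) is the interactive password tool, which prompts for a password for each built-in user:

# set passwords for the built-in users (elastic, kibana, logstash_system, ...)
>> sudo /usr/share/elasticsearch/bin/elasticsearch-setup-passwords interactive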

Let’s start our services:

>> systemctl start elasticsearch
>> systemctl start kibana
>> systemctl restart nginx

Now you have successfully set up Kibana and Elasticsearch. Go to your server_name in a browser and log in to Kibana.
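To verify Kibana is up before opening the browser, you can hit its status API directly and through the Nginx proxy (the IP is the server_name from the Nginx config above; depending on your security settings Kibana may also ask you to log in):

# Kibana status endpoint, bypassing Nginx
>> curl -s http://localhost:5601/api/status
# through Nginx, with the basic-auth credentials created earlier
>> curl -u kibanaadmin http://168.61.168.150/api/status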

If you face any issues, check the logs:

>> journalctl -u elasticsearch
>> journalctl -u kibana
>> cat /var/log/nginx/error.log
# to test whether Elasticsearch is responding
>> curl http://localhost:9200
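If X-Pack security is enabled (the bootstrap password and the Kibana credentials above assume it is), the unauthenticated request will be rejected, so pass the elastic user’s credentials; checking cluster health is also useful (both are standard Elasticsearch APIs):

# authenticate as the built-in elastic user; curl will prompt for the password
>> curl -u elastic http://localhost:9200
# overall cluster health
>> curl -u elastic "http://localhost:9200/_cluster/health?pretty"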

An Issue I Faced While Setting Up My Server

While I was setting up Kibana, I was seeing this error in the Nginx logs:

2020/04/13 13:43:08 [crit] 49662#49662: *152 crypt_r() failed (22: Invalid argument), client: 157.42.72.240, server: 168.61.168.150, request: "GET / HTTP/1.1", host: "168.61.168.150"

I have since solved this issue and updated this post, but if you want to know more about it you can refer to

(GitHub issue: https://github.com/nginx-proxy/nginx-proxy/issues/643) and

https://stackoverflow.com/questions/31833583/nginx-gives-an-internal-server-error-500-after-i-have-configured-basic-auth

Finally, Configure Logstash

Sending logs to Elasticsearch

  • Log in to Kibana [simply go to the server name that you set up in Nginx]
  • Install Logstash
  • Configure its output to Elasticsearch (a minimal pipeline sketch follows this list)
  • Check Kibana for the incoming logs
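As a rough sketch of that configuration step (not the exact pipeline from this setup: the log path, index name, and credentials below are placeholders to adapt, and the install assumes the Elastic apt repository as with Kibana):

>> sudo apt install logstash
>> vim /etc/logstash/conf.d/syslog.conf

''' Example pipeline: read syslog and ship it to Elasticsearch '''
input {
  file {
    path => "/var/log/syslog"
    start_position => "beginning"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    # replace with the Elasticsearch credentials you configured earlier
    user => "elastic"
    password => "qwertyuiop"
    index => "syslog-%{+YYYY.MM.dd}"
  }
}

>> sudo systemctl start logstash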

To set up Alerts and Notifications, refer to this link.

If something goes wrong (for example with Kibana reporting’s headless Chromium), please refer to https://github.com/puppeteer/puppeteer/blob/master/docs/troubleshooting.md

Thank you and happy coding!
You can also read my article about setting up Prometheus and Grafana here.
