Centralize Your Docker Logs with Fluentd and Elasticsearch on Kubernetes
Introduction
In this blog we will deploy Elasticsearch, Fluentd, and Kibana for centralized log aggregation in a Kubernetes cluster.
I am assuming that you already have a running Kubernetes cluster.
Fluentd: Fluentd runs as a DaemonSet in the cluster; it collects logs from all the nodes and forwards them to the Elasticsearch service.
Elasticsearch: A search engine based on Lucene. It provides a distributed, multitenant-capable full-text search engine with an HTTP web interface and schema-free JSON documents.
Kibana: A web interface for searching and viewing the logs that Fluentd has indexed into Elasticsearch.
We mount /var/lib/docker/containers from the host machine into the Fluentd container at /var/log/containers/ (/var/lib/docker/containers is the location where Docker writes the log files of all created containers). Fluentd then reads all the JSON logs from this location.
Note: an application should write all of its logs directly to stdout/stderr so that Fluentd is able to collect them.
All of this configuration is in the td-agent.conf file inside the container.
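The host-path mount described above might look like the following in the Fluentd DaemonSet spec. This is a minimal sketch; the resource names and image tag are illustrative assumptions, not the exact manifest from this setup:

```yaml
# Sketch of the volume wiring in a Fluentd DaemonSet (illustrative names).
apiVersion: extensions/v1beta1
kind: DaemonSet
metadata:
  name: fluentd-es
  namespace: kube-system
spec:
  template:
    spec:
      containers:
      - name: fluentd-es
        image: gcr.io/google_containers/fluentd-elasticsearch:1.22  # assumed image
        volumeMounts:
        # Host directory with the Docker container JSON logs,
        # mounted where the tail source expects them.
        - name: varlibdockercontainers
          mountPath: /var/log/containers
          readOnly: true
      volumes:
      - name: varlibdockercontainers
        hostPath:
          path: /var/lib/docker/containers
```

Because it is a DaemonSet, one Fluentd pod is scheduled on every node, so every node's Docker logs are picked up.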
<source>
type tail
path /var/log/containers/*.log
pos_file /var/log/es-containers.log.pos
time_format %Y-%m-%dT%H:%M:%S.%NZ
tag kubernetes.*
format json
read_from_head true
</source>
# Example:
# 2015-12-21 23:17:22,066 [salt.state ][INFO ] Completed state [net.ipv4.ip_forward] at time 23:17:22.066081
<source>
type tail
format /^(?<time>[^ ]* [^ ,]*)[^\[]*\[[^\]]*\]\[(?<severity>[^ \]]*) *\] (?<message>.*)$/
time_format %Y-%m-%d %H:%M:%S
path /var/log/salt/minion
pos_file /var/log/es-salt.pos
tag salt
</source>
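The tail sources above only collect logs; a match section in the same td-agent.conf forwards them to Elasticsearch. A minimal sketch using the fluent-plugin-elasticsearch output, assuming the Elasticsearch service is named elasticsearch-logging (as in the standard add-on manifests):

```
<match **>
   type elasticsearch
   # "elasticsearch-logging" is the assumed Kubernetes Service name
   host elasticsearch-logging
   port 9200
   # Write logstash-style daily indices so Kibana picks them up
   logstash_format true
   flush_interval 5s
</match>
```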
To set up the Elasticsearch cluster and a Kibana frontend:
- Deploy Elasticsearch as a ReplicationController and attach a Service to it.
- Run Kibana as a Deployment object and expose it through a NodePort Service so it is reachable from outside the cluster.
- Deploy Fluentd as a DaemonSet so it collects logs from all the nodes.
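The Kibana exposure from the second step can be sketched as a Service of type NodePort. The label selector and nodePort value here are assumptions for illustration:

```yaml
# Sketch: expose Kibana outside the cluster via a NodePort Service.
apiVersion: v1
kind: Service
metadata:
  name: kibana-logging
  namespace: kube-system
spec:
  type: NodePort
  selector:
    k8s-app: kibana-logging   # assumed pod label
  ports:
  - port: 5601        # Kibana's default listening port
    targetPort: 5601
    nodePort: 30601   # then browse to http://<any-node-ip>:30601
```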
You can check the running pods with the command below:
core@master ~/ekf/kub-efk-stack $ kubectl get pods --namespace=kube-system
NAME READY STATUS RESTARTS AGE
elasticsearch-logging-293c9 1/1 Running 0 3h
elasticsearch-logging-dwlks 1/1 Running 0 3h
fluentd-es-v1.22-cq31q 1/1 Running 0 3h
fluentd-es-v1.22-r98cs 1/1 Running 0 3h
fluentd-es-v1.22-v3g2b 1/1 Running 0 3h
kibana-logging-974994387-jgqr1 1/1 Running 0 3h
Now you can access your Kibana dashboard and check the logs.
You can refer to my GitHub repo for more details.