
Create a real-time environment to monitor Container Logs using Global ELK

Ayandeep Das | Technical Specialist - ETL, Ashnik
Mumbai, 17 Dec 2019



Ashnik’s technical team is constantly designing solutions to resolve distinct business challenges. Here’s one such interesting, scalable solution we devised: monitoring container logs in real time with a global ELK stack running outside the OpenShift cluster, instead of running EFK inside the containers. The solution was developed as a ready reckoner, so that similar issues can be addressed in any customer’s environment.
One pain point we often hear is that customer teams are unable to gather logs from all their data centres in a consolidated manner. Typically, containers run across several data centres, with EFK (Elasticsearch, Fluentd, Kibana) installed separately as a DaemonSet in each cluster. You end up with multiple EFK stacks in separate data centres, which raises costs heavily and still does not provide a centralized view of all the container logs. Besides, Fluentd’s comparatively limited filtering features mean it may not capture all the vital details.

Values of the Solution

  • We removed ELK’s dependency on the container platform: a single global ELK platform ingests logs from multiple data centres and clusters into one centralized place.
  • Hence, minimal cost and maintenance are required to sustain the infrastructure, as one ELK platform replaces a separate EFK stack in each cluster.
  • Logstash brings a large library of filter plugins, which make it easy to derive numerous business insights from the incoming logs.
  • The architecture is highly scalable and resilient, and can be managed globally without interfering with the respective clusters.
  • Changes on the ELK side do not require restarts of the containers that produce the logs.
  • Very minimal load is placed on the source systems, i.e. the containers, since Filebeat and Metricbeat are lightweight data shippers compared to Fluentd.
  • The platform can be offered as a service every time new containers are added.

The Solution Itself

With over 10 years of enterprise expertise and varied industry-specific engagements, Ashnik is constantly working on solutions to recurring challenges. After studying customer pain points and hiccups in ongoing engagements, we engineered this solution of using an external ELK stack outside the OpenShift container platform: one that provides high scalability, resiliency and performance. If you are on a similar journey, talk to us. For the moment, the solution is explained below:

Let’s say you are using OpenShift Container Platform 3.x. Install Filebeat at the node level, where the container logs are located. Filebeat is installed from an RPM and runs as a service.
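As a rough sketch, the install-and-run step on each node might look like the commands below; the version number and download URL are illustrative, so pick the build that matches your Elastic Stack:

```shell
# Download the Filebeat RPM (version and URL are illustrative assumptions)
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-7.5.0-x86_64.rpm
sudo rpm -vi filebeat-7.5.0-x86_64.rpm

# Enable and start Filebeat as a systemd service on each node
sudo systemctl enable filebeat
sudo systemctl start filebeat
```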
Please note that this requires a routable IP for the nodes running the containers, so that they can reach the external ELK platform. This approach is not applicable to OpenShift version 4.x or above.
Filebeat reads the container logs and enriches them with the container metadata in real time.
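A minimal filebeat.yml along these lines would cover this step; the log path is the usual node-level location, while the kubeconfig path and Logstash hostname are assumptions for illustration:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/containers/*.log          # container logs on the node
    processors:
      # Enrich each event with pod, namespace and container metadata
      - add_kubernetes_metadata:
          in_cluster: false
          kube_config: /root/.kube/config   # assumed kubeconfig path
          matchers:
            - logs_path:
                logs_path: "/var/log/containers/"

# Ship events to the external Logstash instance (hostname is an assumption)
output.logstash:
  hosts: ["logstash.example.com:5044"]
```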


Logstash, Elasticsearch and Kibana are installed outside OpenShift, run as services, and are fed with the logs shipped by Filebeat.
Logstash is utilized to filter the logs and derive more insights from them, after which they are ingested into Elasticsearch.
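For illustration, a Logstash pipeline for this stage might look like the sketch below; the port, field names, namespace and index pattern are assumptions rather than part of the original solution:

```conf
input {
  beats {
    port => 5044                     # port Filebeat ships to
  }
}

filter {
  # Parse JSON-formatted container log lines into structured fields
  json {
    source => "message"
    skip_on_invalid_json => true
  }
  # Example enrichment: tag events from a hypothetical "payments" namespace
  if [kubernetes][namespace] == "payments" {
    mutate { add_tag => ["payments"] }
  }
}

output {
  elasticsearch {
    hosts => ["http://es-node1:9200"]
    index => "container-logs-%{+YYYY.MM.dd}"   # daily indices
  }
}
```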
Elasticsearch is installed with a minimum three-node configuration. You can scale the number of nodes based on the volume of data coming from Filebeat, and can subsequently adopt a hot-warm architecture. Here, we have deployed and configured a three-node Elasticsearch setup.
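As a sketch, the elasticsearch.yml for one node of a three-node 7.x cluster could look like the following; the cluster name and node names are assumptions:

```yaml
cluster.name: global-elk           # assumed cluster name
node.name: es-node1                # this node; es-node2 and es-node3 are peers
network.host: 0.0.0.0
discovery.seed_hosts: ["es-node1", "es-node2", "es-node3"]
cluster.initial_master_nodes: ["es-node1", "es-node2", "es-node3"]
```

The same file, with node.name changed, goes on the other two nodes; the three-node minimum lets the cluster keep a master quorum if one node fails.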
Dashboards can be created in Kibana to give a graphical view of these insights alongside the default container metadata.

The Conclusion

The ELK architecture can be used in any environment where you need to monitor data in real time, whether it’s logs or sensor data. ELK’s monitoring, business-insight and dashboarding features are hard to beat.
If you are running Elasticsearch inside the container platform but not using it efficiently, you will not get high scalability and resiliency. This is precisely where team Ashnik can provide its expertise in implementing a globalised, centralized ELK, ensuring your architecture is highly scalable and resilient at minimal cost.
As many organizations observe, technology can easily be adopted in any environment, but using it optimally and accurately is what delivers success. This is the very essence of Ashnik’s growing solution-based offerings across Southeast Asia and India. Facing something similar? Drop us a quick note and we’ll connect to deliver!


  • Ayandeep is a Technical Specialist – ETL at Ashnik, Mumbai. He is instrumental in growing Ashnik’s business through his technical engagements and is a Subject Matter Expert in Pentaho and Big Data solutions. He has over 8 years of experience in designing and developing solutions on technologies like Pentaho, ETL, Big Data, Oracle, PL/SQL, Core Java, Spark and Kafka.
