Recent Posts by Kirill Goltsman

Kirill Goltsman is a tech writer, blogger, and technology enthusiast with over five years of experience. His portfolio includes research and blog articles on topics as diverse as cloud computing, machine learning, artificial intelligence, and web programming.

We cannot over-emphasize that the ELK stack is a great solution for shipping, searching, and analyzing logs, system metrics, statistics, and other insight-driven data. You can use components of the stack such as Kibana to monitor what is happening in your clusters, hosts, and applications, getting instant insights that guide your business decisions.

However, what options do we have for monitoring Elasticsearch itself? To make Elasticsearch serve requests fast and to keep the cluster healthy, we need a good monitoring solution that helps identify issues as they arise. Fortunately, there are many free monitoring tools available for Elasticsearch, including Elasticsearch Kopf, BigDesk, and Whatson.

In this article, we'll review one of the best web-based monitoring tools for Elasticsearch -- ElasticHQ. Qbox has chosen this plugin as the built-in monitoring solution for its hosted Elasticsearch 6.2.1 clusters.

Keep reading

Logstash ships with many input, codec, filter, and output plugins that can be used to retrieve, transform, filter, and send logs and events from various applications, servers, and network channels. 

In the previous tutorials, we discussed how to use Logstash to ship Redis logs, index emails using the Logstash IMAP input plugin, and many other use cases.

In this article, we continue our journey into the rich world of Logstash input plugins, focusing on the Beats family (e.g., Filebeat and Metricbeat), various file and system input plugins, network, email, and chat protocols, cloud platforms, web applications, and message brokers/platforms. Logstash currently supports over 50 input plugins -- and more are coming -- so covering all of them in one article is not possible. Therefore, we decided to review some of the most popular input plugin categories to give you a general picture of what you can do with Logstash.
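As a taste of what such a pipeline looks like, here is a minimal sketch of a Logstash configuration that accepts events from Beats shippers such as Filebeat and forwards them to Elasticsearch; the host address and index pattern below are illustrative placeholders for a local setup:

```conf
# Minimal Logstash pipeline sketch: listen for Beats traffic and
# forward it to Elasticsearch. Host and index name are placeholders.
input {
  beats {
    port => 5044            # default port Beats shippers send to
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"   # one index per day
  }
}
```

Individual input plugins covered in the article slot into the `input { }` block in the same way.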

Keep reading

So you have moved all your applications to Docker and have begun enjoying all the fruits of lightweight and fast-to-deploy containers. 

That's great, but once you have multiple containers spread across multiple nodes, you'll need to find a way to track their health, storage, CPU, and memory usage, network load, etc. 

To track these metrics, you need an efficient monitoring solution and some backend store to keep your container data for subsequent analysis and processing. Managing thousands of Docker containers in production made our team here at Qbox quickly realize that Docker container monitoring is a valuable addition to our cluster management process. 

In a previous article, we discussed how to use Metricbeat to ship metrics from Kubernetes. Now, it's time to share our experience of using Metricbeat to monitor bare Docker containers and ship container data to Elasticsearch and Kibana. This knowledge may be useful for developers and administrators who manage Docker containers without orchestration. Let's get started!
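In essence, monitoring bare containers boils down to enabling Metricbeat's `docker` module. A minimal sketch of the relevant `metricbeat.yml` fragment might look like the following; the Docker socket path and Elasticsearch host are assumptions for a default local install:

```yaml
# metricbeat.yml (fragment): collect container, CPU, memory, network,
# and disk I/O metrics straight from the local Docker daemon socket.
metricbeat.modules:
  - module: docker
    metricsets: ["container", "cpu", "memory", "network", "diskio"]
    hosts: ["unix:///var/run/docker.sock"]
    period: 10s
output.elasticsearch:
  hosts: ["localhost:9200"]
```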

Keep reading

Kubernetes is a popular container orchestration and container management platform for automating the deployment, scheduling, and updating of your containerized workloads in distributed compute environments. It goes without saying that managing multiple nodes and applications in Kubernetes requires an efficient monitoring system. You need a real-time picture of events happening in your cluster to get actionable insights for optimizing and improving performance.

Kubernetes ships with some default monitoring and metrics solutions like Heapster and Kubernetes Metrics Server. However, in order to apply analytics, do data discovery, and visualize metrics data flowing from your cluster, you'll be better off using solutions designed specifically for such tasks. One popular option for log and metrics monitoring and analysis is the ELK stack (Elasticsearch, Logstash, Kibana) paired with Elastic Beats log shippers.

In this article, we introduce you to monitoring Kubernetes with ELK and Elastic Beats. In particular, we'll show how to send Kubernetes metrics to Elasticsearch indexes using Metricbeat and access them in your Kibana dashboard for subsequent processing. Let's get started!
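As a preview of the setup, here is a hedged sketch of a `metricbeat.yml` fragment enabling Metricbeat's `kubernetes` module; the kubelet read-only port and the Elasticsearch host name are assumptions that depend on your cluster configuration:

```yaml
# metricbeat.yml (fragment): pull node, pod, and container metrics
# from the kubelet read-only API on each node.
metricbeat.modules:
  - module: kubernetes
    metricsets: ["node", "pod", "container", "system", "volume"]
    hosts: ["localhost:10255"]   # kubelet read-only port (assumed)
    period: 10s
output.elasticsearch:
  hosts: ["elasticsearch:9200"]  # placeholder host
```

Deployed as a DaemonSet, one such Metricbeat instance per node covers the whole cluster.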

Keep reading

In Part II of the article, we'll focus on Qbox plugins that provide various third-party integrations including SQL, Neo4j graph platform, and Couchbase Transport, and we will examine language plugins that enhance querying and analysis of the text in Korean, Chinese, Polish, Hebrew, and several other languages.

Keep reading

In Part I of this overview, we'll explore the Qbox plugins for morphological and phonetic analysis, tokenization and concatenation, and native scripting, among others. By the end of this review, you'll have a better understanding of what plugins you might wish to install on your Qbox-hosted cluster.

Keep reading

The choice between self-hosted and managed Elasticsearch involves multiple trade-offs that can be hard to recognize. In this article, we'll compare the two alternatives, Qbox-hosted and self-hosted Elasticsearch, to make you aware of the salient pros and cons of each option.

Keep reading

In this tutorial, we'll show how to create data visualizations with Kibana, a part of the ELK stack that makes it easy to search, view, and interact with data stored in Elasticsearch indices.

Keep reading

Effective log management means being able to instantly draw useful insights from millions of log entries, identify issues as they arise, and visualize and communicate patterns that emerge from your application logs. Fortunately, the ELK stack (Elasticsearch, Logstash, and Kibana) makes it easy to ship logs from your application to Elasticsearch indices for storage and analysis.

Recently, the Elastic infrastructure was extended with a set of useful log shippers called Beats. Filebeat is a member of the Beats family that can be configured to send log events either to Logstash (and from there to Elasticsearch) or directly to Elasticsearch. The tool turns your logs into searchable and filterable Elasticsearch documents with fields and properties that can be easily visualized and analyzed.

In a previous post, we discussed how to use Filebeat to ship Linux system logs. Now, it's time to show how to ship logs from your MySQL database via Filebeat to your Elasticsearch cluster. Making MySQL general and slow logs accessible via Kibana and Logstash will radically improve your database management, log analysis, and pattern discovery, leveraging the full potential of the ELK stack.
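One way to pick up both log types is Filebeat's `mysql` module. A minimal sketch of the `filebeat.yml` fragment follows; the log paths are typical for a Linux MySQL install and may differ on your system:

```yaml
# filebeat.yml (fragment): ship MySQL error and slow logs.
# Paths are common Linux defaults and may need adjusting.
filebeat.modules:
  - module: mysql
    error:
      enabled: true
      var.paths: ["/var/log/mysql/error.log*"]
    slowlog:
      enabled: true
      var.paths: ["/var/log/mysql/mysql-slow.log*"]
output.elasticsearch:
  hosts: ["localhost:9200"]
```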

Keep reading

In the previous tutorial, we discussed how to use elasticsearch.js, the official Node.js client for Elasticsearch, to create indices, add documents, and search them using simple queries and the Query DSL. In this tutorial, we're going to dive deeper into elasticsearch.js, describing more advanced methods and concepts like scrolling, aggregations, and analyzers.

As always, we will be using Qbox-hosted Elasticsearch. We assume that you have installed the latest version of Node.js, downloaded the elasticsearch.js module into your Node.js application, and connected it to your Elasticsearch cluster as described in the previous tutorial.
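As a preview of one of those advanced methods, here is a minimal sketch of paging through every hit in an index with the scroll API. It accepts any elasticsearch.js-style client object; the index name and page size in the usage are illustrative:

```javascript
// Hedged sketch: drain all hits from an index via the scroll API,
// assuming an elasticsearch.js-style client (search/scroll methods).
async function scrollAll(client, index, pageSize) {
  const hits = [];
  // Initial search opens a scroll context kept alive for 30s per page.
  let response = await client.search({
    index,
    scroll: '30s',
    size: pageSize,
    body: { query: { match_all: {} } },
  });
  // Keep pulling pages until an empty page signals we are done.
  while (response.hits.hits.length > 0) {
    hits.push(...response.hits.hits);
    response = await client.scroll({
      scrollId: response._scroll_id,
      scroll: '30s',
    });
  }
  return hits;
}
```

With a real cluster you would pass an `elasticsearch.Client` instance connected as in the previous tutorial, e.g. `scrollAll(client, 'logs', 100)`.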

Keep reading