The penetration testing world moves fast and constantly demands new ideas, tools, and methods for solving problems and breaking things. In recent years, many practitioners have adopted Elasticsearch into the penetration testing workflow, most notably for hacking web applications.

More and more companies and websites are opening bug bounty programs. If you have tools in your arsenal that other people don't yet use or understand, you can earn considerably more from bug bounty hunting. This tutorial teaches you how to use new tools with Elasticsearch to gain that competitive edge.

Keep reading

Having the ability to deploy Elasticsearch, Logstash, and Kibana (ELK) with a single command is a wondrous thing. In this post, we will build an Ansible playbook that does just that.

There are a few prerequisites. This Ansible playbook targets Ubuntu Server and was tested on Ubuntu Server 16.04. A basic system with 2 CPU cores and 4 GB of RAM is enough to get started, though the right specs ultimately depend on your situation and the volume of data.
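As a hedged sketch of what such a playbook might look like, here is a minimal play. The package names and apt repository follow Elastic's standard Debian packaging, but the host group, version branch, and index of tasks are illustrative assumptions, not the exact playbook built later in the post:

```yaml
# Sketch: deploy Elasticsearch, Logstash, and Kibana on Ubuntu in one play.
# Assumes Elastic's official apt packages; adjust the 5.x branch as needed.
- hosts: elk
  become: yes
  tasks:
    - name: Install the Elastic GPG signing key
      apt_key:
        url: https://artifacts.elastic.co/GPG-KEY-elasticsearch
        state: present

    - name: Add the Elastic apt repository
      apt_repository:
        repo: "deb https://artifacts.elastic.co/packages/5.x/apt stable main"
        state: present

    - name: Install Elasticsearch, Logstash, and Kibana
      apt:
        name: "{{ item }}"
        update_cache: yes
        state: present
      with_items:
        - elasticsearch
        - logstash
        - kibana

    - name: Start and enable the services
      service:
        name: "{{ item }}"
        state: started
        enabled: yes
      with_items:
        - elasticsearch
        - logstash
        - kibana
```

Running `ansible-playbook -i hosts elk.yml` against an Ubuntu 16.04 host would then bring the whole stack up in a single command.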

This blog post is an alternative to using the ELK stack on Qbox. To easily deploy and run your own ELK setup on Qbox, simply sign up or launch your cluster here, and refer to the tutorial "Provisioning a Qbox Elasticsearch Cluster."

Keep reading

Parsing Logs Using Logstash

Posted by Vineeth Mohan March 17, 2016

In this tutorial series we are going to use the ELK (Elasticsearch, Logstash, Kibana) stack to parse, index, visualize, and analyze logs. Nearly every process on a server or in an application writes to a log file. These log files are a critical resource for tasks ranging from troubleshooting to anomaly detection.

To analyze logs, one must first parse them into smaller components with appropriate fields and values, index those components in a database, and then run the required analysis. One of the most reliable and scalable stacks for this purpose is the ELK stack. Here, Logstash parses the logs and splits them into proper individual documents. These documents are then indexed into the powerful text analytics engine, Elasticsearch, and finally explored in the visualization tool, Kibana.

In this edition of the ELK blog series we are going to see the setup, configuration, and a basic example of how to parse and index logs using Logstash.
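As a taste of that configuration, here is a minimal Logstash pipeline that parses Apache access logs with the built-in grok filter. The log path, index name, and Elasticsearch host are illustrative assumptions for this sketch:

```conf
# Sketch: read an Apache access log, parse each line into fields with
# grok, and index the resulting documents into Elasticsearch.
input {
  file {
    path => "/var/log/apache2/access.log"   # assumed log location
    start_position => "beginning"
  }
}

filter {
  grok {
    # COMBINEDAPACHELOG is a stock pattern that extracts clientip,
    # timestamp, verb, request, response, bytes, referrer, and agent.
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    # Use the timestamp from the log line as the event's @timestamp.
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "apache-logs"
  }
}
```

Each raw log line becomes a structured document with its own fields, which is what makes the later visualization and analysis in Kibana possible.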

Keep reading

We're always glad to see unsolicited kudos from independent sources. Alex Zhitnitsky over at Takipi has given Qbox a priority placement in his review of ELK-stack cloud services. He shares his own experiences in resolutely managing his home-grown ELK stack and then outlines the advantages of hosted Elasticsearch.

Some Qbox customers have much in common with his story, although many have special business and technical requirements. Now that Alex is enjoying the benefits of hosted Elasticsearch, he offers helpful advice from the perspective of a developer who has run the gamut. We respond to his review in this short article.

Keep reading

If you're a Qbox user or you've been reading this blog, then you probably know plenty about what Elasticsearch can do as a stand-alone product. In this article we present an overview of the entire ELK stack, a bundle of technologies that combine into a very powerful time-series analytics platform, and it's all open source. Continue reading below as we explain how it works and how you can exploit it to manage your log data.

Keep reading