
The penetration testing world is fast moving and persistently demands new ideas, tools and methods for solving problems and breaking things. In recent years many people have gotten used to the idea of using Elasticsearch in the penetration testing workflow, most notably for hacking web applications.

More and more companies and websites are opening bug bounty programs. If you have new tools in your arsenal that other people don’t use or understand yet, then you could be making a great deal more money from Bug Bounty hunting. This tutorial teaches you how to use new tools with Elasticsearch to give you that competitive edge.

One of the first big demonstrations of using Elasticsearch for hacking web applications was the release of a Burp Suite plugin that allows you to log to Elasticsearch. The plugin is called “Report To Elastic Search”; you will need Burp Suite Pro to install it, and you can install it from the BApp Store.


You can use this plugin either by installing Elasticsearch on your own machine, which I don’t recommend, or by making use of Elasticsearch running on a remote server. You could use a hosted Elasticsearch service such as Qbox offers, or you could roll your own. To get started, hosted Elasticsearch is probably the easiest and least time-consuming option.

For this post, I used hosted Elasticsearch on Qbox.io. You can sign up or launch your cluster here, or click “Get Started” in the header navigation. If you need help setting up, refer to “Provisioning a Qbox Elasticsearch Cluster.”

The Report to Elasticsearch plugin is, by default, configured to connect to localhost on the default Elasticsearch port (9200). We can use SSH local forwarding to make the plugin “think” that our remote Elasticsearch is running locally. The command below makes your remote Elasticsearch available on localhost:9200 on your local machine; the remote Elasticsearch server is obviously not my local machine.

$ ssh -f -N -L 9200:localhost:9200 user@<remote-host>

Now we can send data to Elasticsearch, or just interact with the Elasticsearch instance running on the remote machine as if it were running locally. For example, on my local machine I can list the indices on my Elasticsearch cluster with:

$ curl 'localhost:9200/_cat/indices?v'

The screenshot shows the output of running the curl command on my local machine:


If you don’t have Burp Suite Pro, there is a similar plugin that you can install and use with Burp Suite Free. The hardest part of using this plugin is installing it; the plugin seems to be still under development, so you will need to figure out how to install it yourself.

As with the previous plugin, you can have Elasticsearch and Kibana running on another machine as long as you use SSH local forwarding to access both of them. The plugin, ElasticBurp, can be found here; the file is here. This is the file that you upload to Burp as a plugin. The other files in the repo will also be useful to have on your local machine, but only that file is loaded as the plugin.


You need to download the Jython standalone jar file and point Burp Suite to it. You will also need to add the path to the directory where your Python libraries are installed; on my machine it is /usr/local/lib/python2.7/dist-packages/. See the screenshot for the options to set in Burp Suite’s Extender section.


Clone the tool and cd into its directory. This is on your desktop machine.

$ git clone https://github.com/thomaspatzke/WASE
$ cd WASE/

The plugin ElasticBurp and the other tools that go with it have various dependencies, which you will need to install. Copy the Python 2 code from the WASE directory into the path where Python stores your libraries (strictly, only the files written in Python 2.7 belong there):

$ sudo cp  *.py /usr/local/lib/python2.7/dist-packages/

Apply the correct permissions:

$ sudo chown root:staff  /usr/local/lib/python2.7/dist-packages/*.py

Install the other dependencies:

$ sudo pip install six tzlocal urllib3 elasticsearch elasticsearch_dsl

The WASE project contains different tools, so try not to get confused. WASEQuery.py is written in Python 3, so you will have to install some Python 3 dependencies in order to use it. You can do that with:

$ sudo pip3 install elasticsearch elasticsearch_dsl

In the WASE directory there are two directories that are symlinks to other files; you can delete them:

$ rm -rf elasticsearch/ 
$ rm -rf elasticsearch_dsl/

Once you are done installing the plugin you can start using it. The plugin creates its index in Elasticsearch itself, so you don’t need to make an index.

I suggest you set up SSH local forwarding for the ports that Kibana and Elasticsearch run on before you start up the plugin. You also need to start the Elasticsearch and Kibana services on your remote server.

Run this on your local machine to enable ssh local forwarding:

$ ssh -f -N -L 5601:localhost:5601 timo@<remote-host>
$ ssh -f -N -L 9200:localhost:9200 timo@<remote-host>

Now you can start Burp Suite with the command below. Remember to set your browser to proxy traffic through Burp once Burp Suite has started.

$ java -jar burpsuite_free_v1.7.06.jar

Now you need to install and enable the plugin in the Extender tab in Burp Suite. See screenshot:


If everything went well then you should see something like this:


Perhaps you don’t want to log all the results to the same index. You can log your results to an index named after the target that you are testing, or simply add a timestamp to the index name so that you don’t log everything to one large index.

All of these settings can be changed in the ElasticBurp tab in Burp Suite, as seen in this screenshot. I changed my index name to wase-burp-bug-bounty.


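If you test many targets, a small helper can generate these per-target index names consistently. A minimal sketch: the wase-burp prefix matches the name chosen above, while the slug and date format are my own choices.

```python
from datetime import date

def wase_index_name(target, prefix="wase-burp"):
    """Build a per-target, dated index name for ElasticBurp logging."""
    # Elasticsearch index names must be lowercase; replacing dots keeps
    # the target readable as part of the name.
    slug = target.lower().replace(".", "-")
    return "{}-{}-{}".format(prefix, slug, date.today().isoformat())

print(wase_index_name("example.com"))
```

Paste the resulting name into the index-name field in the ElasticBurp tab before you start testing a new target.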
You can now start browsing your given target using either Burp or a combination of Burp and your browser. You can right-click on a response and select “Add To Elasticsearch Index” to make the response searchable in Elasticsearch:


This allows you to search responses for specific keywords of interest, for example “error”, or for something else such as comments in the HTML. To search the index from Kibana, you first create an index pattern.

See the screenshot below where we create an index pattern for the new index name we chose to use for ElasticBurp:


We have now made our index searchable in Kibana:


Now you can search for a keyword. The example below does not use a very distinctive keyword; nonetheless, we search for responses that set a cookie in the user’s browser by searching for the keyword Cookie:


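The same search can also be run outside Kibana, directly against the Elasticsearch HTTP API. A minimal sketch of building the query body: the index name wase-burp-bug-bounty matches the one chosen earlier, and posting the body to the _search endpoint with curl or an Elasticsearch client is left to you.

```python
import json

def keyword_query(keyword):
    """Build an Elasticsearch query body matching documents that
    contain the keyword in any indexed field."""
    return {"query": {"query_string": {"query": keyword}}}

# Search the ElasticBurp index for responses mentioning "Cookie":
body = keyword_query("Cookie")
print(json.dumps(body, indent=2))
# POST this body to localhost:9200/wase-burp-bug-bounty/_search
```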
There are a few tasks performed during the enumeration stage of a penetration test that are definitely worth logging to Elasticsearch. This is good for reporting purposes, and also for quick search and reference during the penetration test.

One of these tasks is finding subdomains. There are different tools that you can use for the job; one of them is knock, also known as knockpy. You can install it with:

$ sudo mkdir /opt/pentest
$ sudo chown `whoami`:`whoami` /opt/pentest/
$ cd /opt/pentest
$ git clone https://github.com/guelfoweb/knock
$ cd knock
$ sudo python setup.py install

Now we can run knock on an example domain. I’m going to run it on yahoo.com, as they actually have a very cool bug bounty program:

$ python knockpy/knockpy.py yahoo.com

Target information        yahoo.com

Ip Address        Target Name
----------        -----------
                  yahoo.com
                  yahoo.com
                  yahoo.com

Knock leaves behind a report file in the form of a CSV file, which we can index into Elasticsearch using Logstash. This is what the CSV file looks like:

$ head yahoo_com_1475932483.72.csv 
target,ip address,domain name,type
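To get a feel for the documents Logstash will produce, the report can also be parsed by hand with Python’s csv module. A minimal sketch: only the header line comes from the output above, and the data row is made up for illustration.

```python
import csv
import io

def knockpy_docs(csv_text):
    """Parse a knockpy CSV report into dicts ready for indexing.
    Field names come from the report's header row."""
    return [dict(row) for row in csv.DictReader(io.StringIO(csv_text))]

# Header from the real report; the data row is illustrative.
sample = ("target,ip address,domain name,type\n"
          "yahoo.com,203.0.113.10,www.yahoo.com,alias\n")

for doc in knockpy_docs(sample):
    print(doc)
```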

As you can see, the structure of the file is pretty simple, so we can index it with Logstash with very little effort. To make things easier we first create a mapping, using a PUT request in Sense. This is what we use for our mapping:

PUT knockpy
{
  "mappings": {
    "logs": {
      "properties": {
        "@timestamp": {
          "type": "date",
          "format": "strict_date_optional_time||epoch_millis"
        },
        "@version": {
          "type": "string"
        },
        "domain name": {
          "type": "string",
          "index": "not_analyzed"
        },
        "host": {
          "type": "string",
          "index": "not_analyzed"
        },
        "ip address": {
          "type": "string",
          "index": "not_analyzed"
        },
        "message": {
          "type": "string",
          "index": "not_analyzed"
        },
        "path": {
          "type": "string",
          "index": "not_analyzed"
        },
        "target": {
          "type": "string",
          "index": "not_analyzed"
        }
      }
    }
  }
}

Next we can run our config on the *.csv files that knock has created, so that Logstash indexes our data into Elasticsearch. This is what we put in our config:

input {
  file {
    path => "/opt/pentest/knock/*.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["target","ip address","domain name","type"]
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "knockpy"
  }
  stdout {}
}

We can run our config with:

$ /opt/logstash/bin/logstash -f knockpy.conf
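Once Logstash finishes, a quick way to confirm the import is `curl localhost:9200/knockpy/_count`, which returns a small JSON document. Parsing it is trivial; the response body below is illustrative, not real output.

```python
import json

# Illustrative body returned by: curl localhost:9200/knockpy/_count
response = '{"count": 120, "_shards": {"total": 5, "successful": 5, "failed": 0}}'

data = json.loads(response)
print("Indexed documents:", data["count"])
```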



This is just a showcase of indexing the results of two well-known penetration testing tools into Elasticsearch. You could index the results of many other penetration testing tools, if you do some research or change their code to add the functionality. By indexing the results of the different tools that we use, we make the penetration testing workflow smoother, and we can find things that we might otherwise have missed.

Make sure that if you make use of Elasticsearch you lock it down, especially if you log sensitive data that could be used by attackers to find the attack points you found for a given target. Questions/Comments? Drop us a line below.


Give It a Whirl!

It’s easy to spin up a standard hosted Elasticsearch cluster on any of our 47 Rackspace, SoftLayer, Amazon, or Microsoft Azure data centers. And you can now provision a replicated cluster.

Questions? Drop us a note, and we’ll get you a prompt response.

Not yet enjoying the benefits of a hosted ELK-stack enterprise search on Qbox? We invite you to create an account today and discover how easy it is to manage and scale your Elasticsearch environment in our cloud hosting service.