The penetration testing world is fast-moving and persistently demands new ideas, tools, and methods for solving problems and breaking things. In recent years, many people have gotten used to the idea of using Elasticsearch in the penetration testing workflow, most notably for hacking web applications.

More and more companies and websites are opening bug bounty programs. If you have new tools in your arsenal that other people don’t use or understand yet, you could be making a great deal more money from bug bounty hunting. This tutorial teaches you how to use new tools with Elasticsearch to give you that competitive edge.

One of the first big demonstrations of using Elasticsearch for hacking web applications was the release of the Burp Suite plugin that allows you to log to Elasticsearch. You will need Burp Suite Pro to install this plugin, and you can install it from the BApp Store. The plugin is called “Report To Elastic Search”.

[Screenshot: the Report To Elastic Search plugin in the BApp Store]

You can use this plugin either with Elasticsearch installed on your own machine, which I don’t recommend, or with Elasticsearch running on a remote server. You could use a hosted Elasticsearch service such as the one Qbox offers, or you could roll your own. To get started, hosted Elasticsearch is probably the easiest and least time-consuming option.

For this post, I used hosted Elasticsearch on Qbox.io. You can sign up or launch your cluster here, or click "Get Started" in the header navigation. If you need help setting up, refer to "Provisioning a Qbox Elasticsearch Cluster."

The Report To Elastic Search plugin is, by default, configured to connect to localhost on the default Elasticsearch port. We can use SSH local forwarding to make the plugin “think” that our remote Elasticsearch is running locally. The following command makes your remote Elasticsearch available on localhost:9200 on your local machine. In this example, 172.20.0.203 is our Elasticsearch server, which is obviously not my local machine.

$ ssh -f -N -L 9200:127.0.0.1:9200 user@172.20.0.203
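
If you use this tunnel often, you can persist it in ~/.ssh/config instead of retyping the flags each time. A minimal sketch, with a made-up host alias:

Host pentest-es
    HostName 172.20.0.203
    User user
    LocalForward 9200 127.0.0.1:9200

After that, running "ssh -f -N pentest-es" sets up the same forward.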

Now we can send data to Elasticsearch, or just interact with the Elasticsearch instance running on the remote machine as if it were running locally. For example, on my local machine I could list the indices on my Elasticsearch cluster with:

$ curl "localhost:9200/_cat/indices?v"

The screenshot shows the output of running the curl command on my local machine:

[Screenshot: listing the indices of the remote Elasticsearch from the local machine]
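
If the tunnel is not working, curl will simply fail to connect. Another quick end-to-end check, using the standard cluster health API, is:

$ curl "localhost:9200/_cluster/health?pretty"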

If you don’t have Burp Suite Pro, there is a similar plugin that you can install and use with Burp Suite Free. The hardest part of using this plugin is installing it: the plugin still seems to be under development, so you will need to figure out how to install it yourself.

As with the previous plugin, you can have Elasticsearch and Kibana running on another machine as long as you use SSH local forwarding to reach both of them. The plugin is ElasticBurp, which is part of the WASE project (the repository we clone below). The ElasticBurp file is the one you upload to Burp as a plugin; the other files in the repo are also useful to have on your local machine, but only that file is loaded as the plugin.


You need to download the Jython standalone JAR file and point Burp Suite to it. You will also need to add the path to the directory where your Python libraries are installed. On my machine it is /usr/local/lib/python2.7/dist-packages/. See the screenshot for the options to set in Burp Suite’s Extender section.

[Screenshot: Extender options in Burp Suite]
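
If you are not sure which directory to use, you can print Python 2’s module search path and pick the site-packages or dist-packages entry from the output (the exact path varies per distribution):

$ python2 -c "import sys; print('\n'.join(sys.path))"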

Clone the tool and cd into its directory. Do this on your desktop machine.

$ git clone https://github.com/thomaspatzke/WASE
$ cd WASE/

The ElasticBurp plugin and the other tools that go with it have various dependencies that you will need to install. Copy the Python 2 code from the WASE directory into the path where Python stores your libraries (strictly speaking, only the files written in Python 2.7 need to go there):

$ sudo cp  *.py /usr/local/lib/python2.7/dist-packages/

Apply the correct permissions:

$ sudo chown root:staff  /usr/local/lib/python2.7/dist-packages/*.py

Install the other dependencies:

$ sudo pip install six tzlocal urllib3 elasticsearch elasticsearch_dsl
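
As an optional sanity check, you can verify that the libraries import correctly under Python 2:

$ python2 -c "import six, tzlocal, urllib3, elasticsearch, elasticsearch_dsl; print('ok')"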

The WASE project contains several tools, so try not to get confused. The WASEQuery.py script is written in Python 3, so you will have to install some Python 3 dependencies in order to use it. You can do that with:

$ sudo pip3 install elasticsearch elasticsearch_dsl

In the WASE directory there are two directories that are just symlinks -- you can delete them:

$ rm -rf elasticsearch/ 
$ rm -rf elasticsearch_dsl/

Once you are done installing the plugin, you can start using it. The plugin creates its index in Elasticsearch, so you don’t need to create one yourself.

I suggest you set up SSH local forwarding for the ports that Kibana and Elasticsearch run on before you start the plugin. You also need to make sure the Elasticsearch and Kibana services are running on your remote server.

Run this on your local machine to enable ssh local forwarding:

$ ssh -f -N -L 5601:127.0.0.1:5601 timo@172.20.0.203
$ ssh -f -N -L 9200:127.0.0.1:9200 timo@172.20.0.203
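
You can also combine both forwards into a single command if you prefer:

$ ssh -f -N -L 5601:127.0.0.1:5601 -L 9200:127.0.0.1:9200 timo@172.20.0.203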

Now you can start Burp Suite with the command below. Remember to set your browser to proxy traffic via 127.0.0.1:8080 once Burp Suite has started.

$ java -jar burpsuite_free_v1.7.06.jar

Now you need to install and enable the ElasticBurp plugin in the Extender tab of Burp Suite. See the screenshot:

[Screenshot: installing ElasticBurp via the Extender tab]

If everything went well then you should see something like this:

[Screenshot: ElasticBurp successfully installed]

Perhaps you don’t want to log all your results to the same index: you can log to a different index name related to the target you are testing, or simply add a timestamp to the index name so that you don’t log everything to one large index.

All of these settings can be changed in Burp Suite in the ElasticBurp tab, as seen in this screenshot. I changed my index name to wase-burp-bug-bounty.

[Screenshot: the index name changed to wase-burp-bug-bounty in the ElasticBurp tab]
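
At any point you can confirm from the command line that the index exists (the name must match whatever you configured in the ElasticBurp tab):

$ curl "localhost:9200/_cat/indices/wase-*?v"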

You can now start browsing your target using either Burp or a combination of Burp and your browser. You can right-click on a response and select “Add To Elasticsearch Index” to make that response searchable in Elasticsearch:

[Screenshot: the “Add To Elasticsearch Index” option in the right-click menu]

This allows you to search responses for specific keywords of interest, for example “error”, or anything else you might be looking for, such as comments in the HTML. To search the index in Kibana, you first need to create an index pattern for it.

See the screenshot below where we create an index pattern for the new index name we chose to use for ElasticBurp:

[Screenshot: creating an index pattern for the ElasticBurp index in Kibana]

We have now made our index searchable in Kibana:

[Screenshot: the WASE index is now searchable in Kibana]

Now you can search for a keyword. The example below does not use a very distinctive keyword, but it shows the idea: we search for responses that set a cookie in the user’s browser by searching for the keyword Cookie:

[Screenshot: searching for the keyword Cookie in Kibana]
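
If you prefer the command line, you can run a similar full-text search directly against Elasticsearch. This is only a sketch: it uses a query_string query across all fields rather than a specific one, and it assumes the index name we configured above:

$ curl -s "localhost:9200/wase-burp-bug-bounty/_search?pretty" -d '
{
  "query": { "query_string": { "query": "Cookie" } }
}'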

There are a few tasks performed during the enumeration stage of a penetration test that are definitely worth logging to Elasticsearch. This is good for reporting purposes, and also for quick search and reference during the penetration test.

One of these tasks is finding subdomains. There are different tools that you can use for the job; one of them is knock, also known as knockpy. You can install it with:

$ sudo mkdir /opt/pentest
$ sudo chown `whoami`:`whoami` /opt/pentest/
$ cd /opt/pentest
$ git clone https://github.com/guelfoweb/knock
$ cd knock
$ sudo python setup.py install

Now we can run knock on an example domain. I’m going to run it on yahoo.com, as they actually have a very cool bug bounty program:

$ python knockpy/knockpy.py yahoo.com
Target information yahoo.com
Ip Address        Target Name
----------        -----------
206.190.36.45     yahoo.com
98.139.183.24     yahoo.com
98.138.253.109    yahoo.com

Knock leaves behind a report file in CSV format. We can index it into Elasticsearch using Logstash. This is what the CSV file looks like:

$ head yahoo_com_1475932483.72.csv 
target,ip address,domain name,type
about.yahoo.com,217.12.1.156,about.yahoo.com,alias
about.yahoo.com,217.12.1.155,fd-geoycpi-uno.gycpi.b.yahoodns.net,host
about.yahoo.com,66.196.66.213,fd-geoycpi-uno.gycpi.b.yahoodns.net,host
about.yahoo.com,66.196.66.212,fd-geoycpi-uno.gycpi.b.yahoodns.net,host
about.yahoo.com,217.12.1.156,fd-geoycpi-uno.gycpi.b.yahoodns.net,host
ad.yahoo.com,204.71.200.45,ad.yahoo.com,alias
ad.yahoo.com,204.71.200.45,a5.yahoo.com,host
adkit.yahoo.com,217.12.13.41,adkit.yahoo.com,alias
adkit.yahoo.com,217.12.13.40,adspecs.yahoo.com,alias
...

As you can see, the structure of the file is pretty simple, so we can index it with Logstash with very little effort. To make things easier, we first create a mapping for the index, using a PUT request in Sense. This is the mapping we use:

PUT knockpy
{
   "mappings": {
     "logs": {
       "properties": {
         "@timestamp": {
           "type": "date",
           "format": "strict_date_optional_time||epoch_millis"
         },
         "@version": {
           "type": "string"
         },
         "address": {
           "type": "string",
            "index":    "not_analyzed"
         },
         "domain name": {
           "type": "string",
            "index":    "not_analyzed"
         },
         "host": {
           "type": "string",
            "index":    "not_analyzed"
         },
         "ip": {
           "type": "string",
            "index":    "not_analyzed"
         },
         "message": {
           "type": "string",
            "index":    "not_analyzed"
         },
         "path": {
           "type": "string",
            "index":    "not_analyzed"
         },
         "target": {
           "type": "string",
            "index":    "not_analyzed"
         }
       }
     }
   }
 }

[Screenshot: creating the knockpy mapping with PUT in Sense]
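
If you don’t have Sense available, the same request can be sent with curl. Here is a sketch that assumes you saved the mapping body above to a file called knockpy-mapping.json (a filename chosen just for this example):

$ curl -XPUT "localhost:9200/knockpy" -d @knockpy-mapping.json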

Next we can write a Logstash config that reads the *.csv files knock has created and indexes the data into Elasticsearch. This is what we put in our config:

input {
  file {
    path => "/opt/pentest/knock/*.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["target","ip","domain name","type"]
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "knockpy"
  }
  stdout {}
}
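
One caveat: with this config the CSV header line is indexed as a document as well. If you want to skip it, a conditional drop after the csv filter takes care of it (a small sketch; it assumes the header row parses with target set to the literal string "target"):

  if [target] == "target" {
    drop { }
  }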

We can run our config with:

$ /opt/logstash/bin/logstash -f knockpy.conf
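
Once Logstash finishes, a quick document count confirms that the data made it into the index (this assumes you are still forwarding Elasticsearch to localhost:9200):

$ curl "localhost:9200/knockpy/_count?pretty"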

With the data indexed, you can also visualize the discovered subdomains in Kibana:

[Screenshot: subdomains visualized in Kibana]

Conclusion

This is just a showcase of indexing the results of two well-known penetration testing tools into Elasticsearch. You could index the results of many other penetration testing tools -- if you do some research, or if you change their code to add the functionality. By indexing the results of the different tools we use, we can make the penetration testing workflow smoother and find things we might otherwise have missed.

If you make use of Elasticsearch, make sure you lock it down, especially if you log sensitive data that could be used by attackers to find the attack points you discovered for a given target. Questions/Comments? Drop us a line below.

