“ChatOps” has recently become a buzzword in organizations aiming for continuous delivery. It centers on chat clients such as Slack and HipChat, extended with chatbots, to enable real-time communication and task execution among members of development and IT operations teams.

Chat has become an integral part of these delivery models. With so much data flowing through it, wouldn’t it be nice if we could feed that data into an analysis tool and churn out results that might improve the business?

In this article, we explore how to integrate Slack with Elasticsearch and walk through a few basic data analyses as examples.

Tutorial

For this post, we will be using hosted Elasticsearch on Qbox.io. You can sign up or launch your cluster here, or click "Get Started" in the header navigation. If you need help setting up, refer to "Provisioning a Qbox Elasticsearch Cluster."

Logstash provides various input plugins for extracting data from other systems. One such input plugin is IRC. Because Slack offers an IRC gateway, we can leverage the IRC integration of Slack and Logstash to stream data from Slack to Elasticsearch via Logstash.

The IRC gateway is not enabled in Slack by default. To enable it, navigate to the “Admin - Settings & Permissions” page, open the “Permissions” tab, and find the “Gateways” section. Expand it, enable IRC (as shown below), and click Save.

slack-elasticsearch1.png#asset:1181

With IRC enabled, we are just one step away from completing the integration. To connect to Slack via IRC, we need the host, user, and password details. These credentials can be obtained by navigating to the “Gateways” page; you will find them under the “Getting Started: IRC” section. Once we have these details, we can configure Logstash’s IRC input plugin with them.


The IRC input plugin requires the host and channels parameters; others, such as user, type, and nick, are optional. In order to connect to Slack, you will need to fill in the correct values for the host, channels, user, nick, and password parameters.

To stream the collected messages to Elasticsearch, configure the “output” section with your Elasticsearch details. Also configure stdout so that the messages being ingested into Elasticsearch are printed to the console. The following shows how the Logstash conf file looks once all the relevant values are filled in:

#slack.conf
input {
  irc {
    host => "esqboxgroup.irc.slack.com"
    # IRC channels to listen on
    channels => [ "#elk_team", "#es_experts" ]
    user => "es_qbox"
    nick => "es_qbox"
    password => "esqboxgroup.M########"
    real => "Slack Integration"
    secure => true
    tags => ["slack"]
    type => "chatops"
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    index => "chatops"
    document_type => "slack"
    hosts => "localhost:9200"
  }
}

Now, start the Logstash agent so that it begins listening for incoming messages from Slack:

$logstash_home> bin/logstash -f slack.conf

Once the agent is started, it streams all incoming messages to Elasticsearch. Since we configured stdout in the output, we can also see those incoming messages on the terminal console:

slack-elasticsearch2.png#asset:1182

The same can be verified by running a search on Elasticsearch against the newly created index.
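A simple match-all query, for example, returns the indexed Slack messages (this sketch assumes Elasticsearch is reachable on localhost:9200, as configured above):

curl -XGET "localhost:9200/chatops/_search?pretty"

The following is a sample response containing the messages streamed from Slack: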

{
  "took" : 5,
  "timed_out" : false,
  "_shards" : {
    "total" : 5,
    "successful" : 5,
    "failed" : 0
  },
  "hits" : {
    "total" : 6,
    "max_score" : 1.0,
    "hits" : [ {
      "_index" : "chatops",
      "_type" : "slack",
      "_id" : "AVgMR63kyQGHpxoeLE0L",
      "_score" : 1.0,
      "_source" : {
        "message" : "Cloud computing is the next big thing",
        "@version" : "1",
        "@timestamp" : "2016-10-28T17:12:53.698Z",
        "user" : "es_qbox!es_qbox@irc.tinyspeck.com",
        "command" : "PRIVMSG",
        "channel" : "#elk_team",
        "nick" : "es_qbox",
        "server" : "esqboxgroup.irc.slack.com:6667",
        "host" : "irc.tinyspeck.com"
      }
    }, {
      "_index" : "chatops",
      "_type" : "slack",
      "_id" : "AVgLM2xGyQGHpxoeLE0G",
      "_score" : 1.0,
      "_source" : {
        "message" : "Qbox has the Best ES Experts",
        "@version" : "1",
        "@timestamp" : "2016-10-28T12:11:08.884Z",
        "type" : "chatops",
        "tags" : [ "slack" ],
        "user" : "es_qbox!es_qbox@irc.tinyspeck.com",
        "command" : "PRIVMSG",
        "channel" : "#es_experts",
        "nick" : "es_qbox",
        "server" : "esqboxgroup.irc.slack.com:6667",
        "host" : "irc.tinyspeck.com"
      }
    }, {
      "_index" : "chatops",
      "_type" : "slack",
      "_id" : "AVgLMrcmyQGHpxoeLE0E",
      "_score" : 1.0,
      "_source" : {
        "message" : "#elk_team  hello",
        "@version" : "1",
        "@timestamp" : "2016-10-28T12:10:21.929Z",
        "type" : "chatops",
        "tags" : [ "slack" ],
        "user" : "es_qbox!es_qbox@irc.tinyspeck.com",
        "command" : "PRIVMSG",
        "channel" : "#elk_team",
        "nick" : "es_qbox",
        "server" : "esqboxgroup.irc.slack.com:6667",
        "host" : "irc.tinyspeck.com"
      }
    } ]
  }
}

So far, we have collected live data from Slack and indexed it into Elasticsearch. Let’s see how we can use the rest of the ELK stack to perform data analytics on it.


Kibana is known for its rich visualization and data analysis functionality, and there are several simple use cases we can derive from the example above. You could potentially:

  • Visualize which Slack channel is most active (see the sample aggregation after this list)
  • Determine the most active user
  • Determine what day/time in the week channels are busiest
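
As a quick sketch of the first use case, a terms aggregation on the channel field shows which channel generates the most messages. This assumes the chatops index created earlier and a not-analyzed (keyword-style) mapping for channel; if your mapping analyzes that field, point the aggregation at the corresponding raw/keyword sub-field instead:

curl -XGET "localhost:9200/chatops/_search?pretty" -d '{
  "size": 0,
  "aggs": {
    "busiest_channels": {
      "terms": { "field": "channel" }
    }
  }
}'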

Kibana

If we configure Slack to receive build, error, or commit notifications, then combining Slack with the ELK stack gives a complete picture of a product’s build process. The following is a snapshot of one such visualization in Kibana, generated from the collected Slack data:

slack-elasticsearch3.png#asset:1183
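
Under the hood, a visualization like this boils down to aggregations over the indexed messages. As a rough sketch, a date histogram on @timestamp, split by channel, charts activity over time and helps spot when build or error notifications spike (again assuming the chatops index and field names from the example above):

curl -XGET "localhost:9200/chatops/_search?pretty" -d '{
  "size": 0,
  "aggs": {
    "activity_over_time": {
      "date_histogram": { "field": "@timestamp", "interval": "hour" },
      "aggs": {
        "per_channel": { "terms": { "field": "channel" } }
      }
    }
  }
}'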

Once ELK and Slack are integrated, there is virtually no limit to what you can do with the data. Questions/Comments? Drop us a line below.


Give It a Whirl!

It's easy to spin up a standard hosted Elasticsearch cluster on any of our 47 Rackspace, Softlayer, Amazon or Microsoft Azure data centers. And you can now provision a replicated cluster.

Questions? Drop us a note, and we'll get you a prompt response.

Not yet enjoying the benefits of a hosted ELK-stack enterprise search on Qbox? We invite you to create an account today and discover how easy it is to manage and scale your Elasticsearch environment in our cloud hosting service.
