Sketching a Slack Slash Parliamentary Auto-Responder Using AWS Lambda Functions

Across a couple of recent posts, I’ve explored how to use a webhook manager to implement a simple Slack bot that handles queries from Slack and returns information from the UK Parliament data API (Searching the UK Parliament API from Slack Slash Commands Using a Python Microservice via Webhooks), and how to use AWS Lambda functions to construct a simple Slack slash command responder (Implementing Slack Slash Commands Using Amazon Lambda Functions – Getting Started).

So this morning, I thought I’d have a go at getting a Slack slash command responder using AWS Lambda functions to handle a couple more queries. Here’s where I got to…

First up, asking for committees that a member of parliament sits on:


Secondly, a query on who the current members of a particular committee are:


One rationale for supporting this sort of query is to provide fingertips information access to a researcher through a unified conversational interface.

To trigger the responses, I’ve used a regular expression that tries to capture several different question types:

import re

x = 'committees that Andrew Turner is on'
regexp = re.compile(r'.*(?:committees[ (?:that|does|is)]*) (.*?)(( (:?is )?(on|sits on|sit on|a member of))|$)')
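As a rough check, the pattern does pull the member’s name out of that query form:

```python
import re

regexp = re.compile(r'.*(?:committees[ (?:that|does|is)]*) (.*?)'
                    r'(( (:?is )?(on|sits on|sit on|a member of))|$)')

m = regexp.match('committees that Andrew Turner is on')
print(m.group(1))  # Andrew Turner
```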

Obviously, this is not very advanced in terms of natural language processing, but the domain is a simple one and the number of forms that a query requesting this sort of information might take will probably be quite limited – and predictable!

Having extracted the member’s name (for a lookup of the committees a member sits on), or the committee name (when trying to look up the members of a particular committee), a URL is generated that can request the data from the Parliament members API. For example:

def committee_URL(c):
    # comm_url is a template URL for the Parliament members API;
    # urlargs is built from the committee name, c (construction elided here)
    return comm_url.format(urlencode(urlargs))

We can then use this URL to get some JSON data back:

from urllib.request import Request, urlopen
import json

def getJSON(url):
    q = Request(url)
    q.add_header('accept', 'application/json')
    r = urlopen(q)
    return json.loads(r.read().decode('utf-8-sig'))

The next step is to parse the JSON response to pull out the information we want, and convert it to a simple text string:

def committeeMembers(members, c):
    if members['Members'] is None: return None
    tl = []
    for m in members['Members']['Member']:
        tl.append('{} ({})'.format(m['FullTitle'], m['Party']['#text']))
    return 'Members of the {}: {}'.format(c, ', '.join(tl))

This text string can then be returned to Slack as the slash command response.
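For reference, a slash command response is just a JSON payload with a text field (the response_type field is optional; setting it to in_channel makes the reply visible to everyone in the channel) – a minimal sketch:

```python
import json

def slack_response(text):
    # Wrap the generated text string in the JSON structure Slack
    # expects back from a slash command responder
    return json.dumps({'response_type': 'in_channel', 'text': text})

print(slack_response('Members of the Defence Committee: ...'))
```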

[UPDATE…] Here’s another example… The members’ API can look up an MP by location, constituency or postcode; so if we try them in turn, we can accept a wide variety of location styles; and it only takes a really simple regular expression to prime the pattern match for what I guess is a wide range of possible conversational gambits for requesting this information:
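By way of illustration, something along these lines works – note that this particular pattern is a guessed reconstruction, not the exact one I used:

```python
import re

# A guessed sketch of the sort of pattern involved - not the exact one used
mp_regexp = re.compile(r'.*(?:MP for|member for|represents) (.*?)\??$')

m = mp_regexp.match('who is the MP for the Isle of Wight?')
print(m.group(1))  # the Isle of Wight
```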


As with the members API, the Parliament data API will also return JSON responses to valid queries (I used the Parliament data API in the original demo). There are quite a few APIs – and datasets – to play with, so as and when I get a moment, I may try to code some more of them up as conversational responders:-)

Implementing Slash Commands Using Amazon Lambda Functions – Encrypting the Slack Token

In an earlier post, Implementing Slack Slash Commands Using Amazon Lambda Functions – Getting Started, I avoided the use of an encrypted Slack token to identify the provenance of an incoming request in favour of the plaintext version, to try to simplify the “getting started with AWS Lambda functions” aspect of that recipe. In this post, I’ll describe how to step up to the mark and use the encrypted token.

Although I tried to limit myself to free tier usage, an invoice from Amazon made me realise that there’s a cost associated with generating and subscribing to AWS encryption keys of $1 per month…

To begin with, you’ll need to create an AWS encryption key. The method is described here but I’ll walk you through it…

The key is generated from the IAM console – select the Encryption Keys element from the left hand sidebar, and then make sure you select the correct AWS region (that is, the region that the Lambda function is defined in) before creating the key:


Check again that you’re in the correct region, and then give your key an alias (I used slackslashtest):


You then need to set various permissions for potential users of the encryption key. I avoided giving anyone administrative permissions:


but I did give usage permissions to the role I’d defined to execute my Lambda function:


Once you’ve assigned the roles and defined the encryption key, you should be able to see it from the IAM Encryption Keys console listing:


Select the encryption key and make a copy of the ARN that identifies it:


You now need to add the ARN for this encryption key to a policy that defines what the role used to execute the Lambda function can do. From the IAM console, select Roles and then the role you’re interested in:


Create a new role policy for that role:


You can use the policy generator tool to create the policy:


Select the AWS Key Management Service, and then select the Decrypt action. This will allow the role to use the decrypt method for the specified encryption key:


Add the ARN for your encryption key (the one you copied above) and select Add Statement to add the decrypt action on the specified encryption key to the newly created role policy.


You can now generate and review the policy – you may want to give it a sensible name:


So… we’ve now created a key, with the alias slackslashtest, and given the role that executes the Lambda function permission to access it as part of the encryption key definition; we’ve then declared access to the Decrypt method via the role policy definition.

Now we need to use the encryption key to encrypt our Slack token. You can do this using the Amazon CLI (Command Line Interface). To do this, you first need to install the AWS CLI on your computer. (I think I did this on a Mac using Homebrew? I’m not sure if there’s an online console way of doing the encryption?)

Once the AWS CLI is installed, you need to configure it. To do this, you need to get some more keys. From the IAM console, select Users and then your user. You now need to Create Access Key.


Creating an access key is fraught with risk – you get one opportunity to look at the key values, and one opportunity to download the credentials, and that’s it! So make a note of the values…


You’re now going to use these access keys to set up the AWS CLI on your computer (you should only need to do this once). After ensuring that the AWS CLI is installed (enter the command aws on the command line and see if it takes!), run the command aws configure and provide your access key credentials. Also make sure you select the region you want to work in.


Having configured the CLI with permission to talk to the AWS servers, you can now use it to encrypt the Slack token. Run the command:

aws kms encrypt --key-id alias/YOUR_KEY_ALIAS --plaintext "YOUR_SLACK_TOKEN"

using appropriate values for the AWS encryption key alias (mine was slackslashtest) and Slack token. This calls the key encryption service and uses the specified encryption key, via its alias, to encrypt the plaintext string.


The CiphertextBlob is the encrypted version of the token. In your AWS Lambda function definition, you can use this value as the encrypted expected token from Slack that checks the provenance of whoever’s made a request to the Lambda function:
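For reference, the relevant lines in the Lambda function look something like the following sketch (variable and function names follow the blueprint code; the placeholder string is where your CiphertextBlob goes):

```python
from base64 import b64decode

# Paste the CiphertextBlob value returned by `aws kms encrypt` here
ENCRYPTED_EXPECTED_TOKEN = "<kmsEncryptedToken>"

def decrypt_expected_token():
    # Deferred import so this sketch loads outside AWS too; in the
    # blueprint this runs at module level
    import boto3
    # The Lambda role's policy grants kms:Decrypt on the key, so the
    # function can recover the plaintext token at runtime
    kms = boto3.client('kms')
    return kms.decrypt(CiphertextBlob=b64decode(ENCRYPTED_EXPECTED_TOKEN))['Plaintext']
```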


Comment out – or better, delete! – the original plaintext version of the Slack token that we used as a shortcut previously, and save the Lambda function.

Now when you call the Lambda function from Slack, via the slash command, it should run as before, only this time the Slack token lookup is made against an encrypted, rather than plaintext, version of it on the AWS side.

In the final post of this short series, I’ll describe how to write a simple test event to test the Lambda function.

Implementing Slack Slash Commands Using Amazon Lambda Functions – Getting Started

The cloud, it seems, is everywhere nowadays. One way I find it useful to classify the offerings is the following crude categorisation:

  • applications, such as Google Docs or Gmail;
  • infrastructure, such as the AWS (Amazon Web Services) S3 storage service or the EC2 compute service (virtual servers and containers);
  • services, such as the AWS Simple Queue Service (SQS) or Lambda functions.

Other ways of categorising offerings are available too; for example, AWS divvy up their offerings as follows:


Having recently just signed back into the AWS world, I thought I’d start to try out some of the first year free tier offerings. So for this first bit of toe dipping into the AWS ocean, I thought I’d see if I could make use of Amazon Lambda functions – “serverless” computational functions executed by AWS – to implement something akin to the Slack slash command handler I described in the previous post.

In that previous post, I described how I used a Slack /slash pattern that takes an HTTP POST request from a Slack slash extension and calls out to a microservice; that service responds via an incoming webhook extension on Slack. The microservice itself also makes a query request to a third party search API. The architecture looks something like this (though I wonder if I could have simplified it by just responding to the slash command request, rather than returning the response via the Slack incoming extension?):


Amazon Lambda functions work in a similar way in terms of how the compute function is defined and executed, but the invocation needs to come either from an event triggered by another AWS source or over HTTPS using an event raised by the Amazon API Gateway (AWS Lambda Function and Event Sources). That is, we need a pattern that looks more like this (though I haven’t tried the call out to the UK Parliament API yet):


A recent post on the AWS blog – New – Slack Integration Blueprints for AWS Lambda – described a simple blueprint for implementing an “echo” slash command handler running on AWS. Excellent – the previous version took me less than half an hour to hack together, so I was hoping for the same with AWS.


That was this morning, well before coffee, and now it’s after lunch. Having got it working, it’s a simple five minute job, but it took me a couple of hours to find the five minute route. (Trying to follow notes on the web is one reason I blog the way I do, and why I have such high regard (honestly!) for the majority of OU materials. Recalling the times when I used to work through maths texts, too many tutorials have a “hence” or “just” step that may be obvious to an expert, but is a huge blocker to a novice…)

So here’s the five minute version (maybe fifteen!;-), containing pictures with boxes and arrows and a paragraph associated with each one to describe what’s going on…

Step the first

You need an Amazon AWS account – which means handing over your credit card. That said, when you sign up you get access to the free tier for a year. You may even get additional credit if you sign up via the Github Student Developer Pack.

Step the second

Go to the AWS Lambda console (you may want to change region – I’m going via Ireland) and get started…



Step the third

We can make use of the simple template for the slack echo command using Python.


Step the fourth

In this step, we start naming things. Names are important, because we’ll be calling things by name to invoke them; you need to keep track of what’s called what and where so that you can make sure you’re calling it properly.

The first thing you need to do is give your Lambda function a name – I’m calling mine simpletest. This is effectively a filename: for the Python function I’m creating, we can think of this setting as the filename that the local/inline copy of the function code is saved to.


The second thing you need to check is the name of the function in the code you want to invoke when the lambda function is called. In the example code, this is the function lambda_handler().

The third thing you need to check is the name of the handler that will be executed when the Lambda function is triggered. This is the function-in-the-file we want to run in the form FILENAME.FUNCTION. In this example, simpletest.lambda_handler.

Step the fifth

Define the Lambda function role. The suggested role is a “Basic execution role”. On first run you won’t have one of these, so you’ll need to create one (your browser will possibly need pop-ups enabling).


Step the Sixth

If you now look at the guidance given in the example Lambda function code, it starts off with the following:

Follow these steps to configure the slash command in Slack:
1. Navigate to https://<your-team-domain>
2. Search for and select "Slash Commands".
3. Enter a name for your command and click "Add Slash Command Integration".
4. Copy the token string from the integration settings and use it in the next section.
5. After you complete this blueprint, enter the provided API endpoint URL in the URL field.

This is all good advice. Except for the use it in the next section bit, because we’re going to ignore that for now.

Step the Seventh – just don’t…

In the guidance, steps are described for encrypting the token you got from the Slack slash definition page. This is Good Practice, but a real pain if you’re just trying to get started and want to check things are working in the first place, because you’ll quite possibly end up going down various ratholes. (I’ll describe what you need to do to follow those steps in another post.)

So for the instructions that begin:

Follow these steps to encrypt your Slack token for use in this function:

just ignore them. Instead, edit the code, comment out the encrypted token handler bits, and paste in a plaintext version of the token you got from Slack. (We’re just trying stuff out, remember… we can reset the token and move to an encrypted one once we know the other bits are working).

#ENCRYPTED_EXPECTED_TOKEN = "<kmsEncryptedToken>" # Enter the base-64 encoded, encrypted Slack command token (CiphertextBlob)

#kms = boto3.client('kms')
#expected_token = kms.decrypt(CiphertextBlob = b64decode(ENCRYPTED_EXPECTED_TOKEN))['Plaintext']

expected_token = 'YOUR_SLACK_TOKEN'

Step the Eighth

The next step of guidance (the bit beginning Follow these steps to complete the configuration of your command API endpoint) refers to what happens on the next screen – which I’ll walk through…

Click on Next from the function definition page, and start to configure the API endpoint, specifically setting the Method to POST and the Security to Open. (You might also want to change the name of the API to something more appropriate, perhaps away from LambdMicroservice and towards something more personally recognisable, such as slacktestservice.) Leave the deployment stage set to prod.



Step the Ninth

Move on to the next step, and you can create your Lambda function:



But…. we’re still not there yet….

Step the Tenth

…there’s still stuff to do with the API definition. From the API Endpoints tab, you need to go into the prod deployment stage settings:

This will allow us to tweak the way that the API handles requests made to it.

Step the Eleventh

From the API Gateway console, select the service associated with the Lambda function, which by default was called LambdMicroservice (if you renamed the service, for example to slacktestservice, click on that instead).



Step the Twelfth

Select the simpletest function, and click on the POST method. This shows the steps associated with the call handler. Click on the Integration Request setting.


We now need to set the API service up so it can handle the Slack POSTed content.

Step the Thirteenth

The Integration Request needs customising to handle the JSON data sent from Slack. To do this, we need to create a Mapping Template for the JSON content.


So create one…

Step the Fifteenth

The mapping we need to make is from the accepted application/x-www-form-urlencoded type. (Note: the official guidance currently (incorrectly) gives this as x-www-form-urlencoded.)



Step the Sixteenth

Select the Mapping template, and define the template as follows: 

{"body": $input.json("$")}


Accept the template setting.
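With that mapping in place, the urlencoded Slack payload arrives in the Lambda function as event['body'], which the blueprint code unpacks with parse_qs; a minimal sketch (the token, user_name and text field names are those Slack sends with a slash command request):

```python
from urllib.parse import parse_qs  # urlparse.parse_qs on the Python 2.7 runtime

def lambda_handler(event, context):
    # The mapping template wraps the POSTed form data as {"body": ...}
    params = parse_qs(event['body'])
    token = params['token'][0]          # checked against the expected token
    user = params['user_name'][0]
    command_text = params.get('text', [''])[0]
    return '%s invoked the command with: %s' % (user, command_text)

print(lambda_handler({'body': 'token=abc&user_name=tony&text=hello+world'}, None))
```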

Step the Seventeenth

Having defined the mapping template, deploy the API.


Make sure you deploy to the correct place (recall, we were using prod)!


Step the Eighteenth

From the Lambda function control panel, you should be able to see the URL for your API endpoint. Grab a copy of this URL.


Step the Nineteenth

Paste the API endpoint URL into the URL field of the Slack slash command definition – making sure it points to the correct function handler (in my case, simpletest).


Make sure you save/update the settings!

Step the Twentieth

Finally, you should be able to try out your Slack slash command…



Phew… got there eventually, albeit insecurely… In a later post, I’ll describe how to do the token encryption bit, because for an AWS n00b it again takes multiple, and not necessarily obvious, steps… I’ll also describe how to set up a simple test case for testing out the function.

PS If I’ve missed anything out in this tutorial, please let me know. I’d only intended to spend half an hour or so tinkering and half an hour blogging this, and it’s now getting on for six hours after I started, though a fair chunk of that time was also spent putting this post together… So if I can spare anyone else the pain…!;-)

Searching the UK Parliament API from Slack Slash Commands Using a Python Microservice via Webhooks

Several years ago now, I remember being excited by the idea of webhooks which provided a simple callback mechanism for executing remote microservice commands on the web via an HTTP request. For whatever reason, webhooks never really became part of my everyday toolkit, but with Amazon Lambda functions coming to my attention again recently as Google experimented with a rival service, I’ve started looking at them again.

To get back into the swing of how these things work, I thought I’d try to put together a simple Python script that could run a search query against a data collection from the UK Parliament API; the request would be triggered from a slash command in Slack.

On the Slack side, you need to define a couple of custom integrations:

  • a Slash Command, which defines the name of the command you want to handle and provides a callback URL that is accessed whenever the slash command is issued;
  • an Incoming Webhook, which provides a callback URL on Slack that can handle a response from the microservice accessed via the slash command callback.


The slash command is declared and the URL of the service to be accessed needs to be specified. You may not have this URL yet, so it can be left blank to start with, though you’ll need to add it in when you get your callback service address. When the callback URL is requested, a token is passed along with any extra text from the slash command string. The callback service can check this token against a local copy to verify that the request has come from a known source.


The incoming webhook creates an endpoint that the service called from the slash command can itself callback to, providing a response to the slash command message.


To handle the slash command, I’m going to develop a simple microservice using Python 2.7. To begin with, I’ll define a couple of hidden environment variables that I can access from my callback script. These are a copy of the token that will be issued as part of the slash command request (so I can verify that the service request has come from a known source) and the Slack incoming webhook address.


I can access these environment variables in my script as elements in the Hook['env'] dict. The data package from Slack can be accessed via the Hook['params'] dict.

The service definition begins with the declaration of the service name, which will provide a stub for the callback URL. The automatically generated Home URL is the one that needs to be provided as the callback URL in the Slack slash command configuration.


The code for the service can be specified locally, or pulled in from a (public) gist.


In the service I’ve defined, I make a request over the Parliamentary research briefings dataset (others are available) and return a list of keyword matching briefings as well as links to the briefing document homepage.

import json, re
import urllib2
from urlparse import urlparse
from urllib import urlopen, urlencode

class UKParliamentReader():
    """ Chat to the UK Parliament API """

    def __init__(self):
        """ Need to think more about the structure of this... """
        pass

    def qpatch(self, query):
        # Rewrite "a b or c d" style queries as "(a AND b) OR (c AND d)"
        t = []
        for a in query.split('or'):
            t.append('({})'.format(a.strip().replace(' ', ' AND ')))
        return ' OR '.join(t)

    def search_one(self, query, typ='Research Papers', page=0, ps=100):
        # url and urlargs are built from the query parameters (construction elided here)
        url = '{}?{}'.format(url, urlencode(urlargs))
        data = json.loads(urlopen(url).read())
        response = []
        for i in data['result']['items']:
            response.append("{} [{}]".format(i['title'], i['identifier']['_value']))
        return response

    def search_all(self, query, typ='Research Papers', ps=100):
        # Page through the full result set using search_one() (body elided here)
        pass

    def responder(self, hook):
        # r is the list of search results; ukparl_url is the Slack incoming
        # webhook address (both elided here)
        r2 = "; \n".join(r)
        payload = {"channel": "#slashtest", "username": "parlibot",
                   "text": "I know about the following Parliamentary research papers:\n\n {}".format(r2)}
        req = urllib2.Request(ukparl_url)
        req.add_header('Content-Type', 'application/json')
        response = urllib2.urlopen(req, json.dumps(payload))

# Only respond if the token in the request matches the locally held copy
if Hook['params']['token'] == Hook['env']['ukparl_token']:
    # ...run the search and post the response (elided)
    pass

With everything now set up, I can make use of the slash command:


Next up, I’ll see if I can work out a similar recipe for using Amazon AWS Lambda functions…

See also: Chatting With ONS Data Via a Simple Slack Bot

NOTE: this recipe was inspired by the following example of creating a Javascript powered slash command handler: Making custom Slack slash commands with

Chatting With ONS Data Via a Simple Slack Bot

A recent post on the ONS Digital blog – Dueling with datasets – describes some of the design decisions taken when putting together the new Office for National Statistics website (such as having a single page for a particular measure that would provide the current figures at the top as well as historical figures further down the page) and some of the challenges still facing the team (such as the language and titling used to describe the statistics).

The emphasis is still very much on publishing the data via a website, however, which promotes two particular sorts of interaction style: browse and search. Via Laura Dewis (Deputy Director, Digital Publishing at Office for National Statistics, and ex- of the OpenLearn parish), I got a peek at some of the popular search terms used on the pre-updated website, which suggest (to me) a mix of vernacular keyword search terms as well as official terms (for example, rpi, baby names, cpi, gdp, retail price index, population, Labour Market Statistics unemployment, inflation, labour force survey).

Over the last couple of years, regular readers will have noticed that I’ve been dabbling with some simple data2text conversions, as well as dipping my toes into some simple custom slackbots (that is, custom slack robots…) capable of responding to simple parameterised queries with texts automatically generated from online data sources (for example, querying the Nomis JSA figures as part of a Slackbot Data Wire, Initial Sketch or my First Steps in a Conversational Slackbot interface to CQC Inspection Data ).

I’m still fumbling around how best to try to put these bots together. On the one hand is trying to work out what sorts of thing we might want to ask of the data, as well as how we might actually ask for it in natural language terms. On the other, is generating queries over the data, and figuring out how to provide the response (creating a canned text around the results from a data query).

But what if there was already a ready source of text interpreting particular datasets that could be used as the response part of a conversational data agent? Then all we’d have to focus on would be parsing queries and matching them to the texts?

A couple of weeks ago, when the new ONS website came out of beta, the human facing web pages were complemented with a data view in the form of JSON feeds that mirrored the HTML text (I don’t know if the HTML is actually generated from the JSON feeds?), as described in More Observations on the ONS JSON Feeds – Returning Bulletin Text as Data. So here we have a ready source of data interpreting text that we may be able to use to provide a backend to a conversational UI to the ONS content. (Whether or not the text is human generated or machine generated is irrelevant – though it does also provide a useful model for developing and testing my own data to text routines!)

So let’s see… it being too wet to go and dig the vegetable patch yesterday, I thought I’d have a quick play at putting together some simple response rules, in part building on some of the ONS JSON parsing code I started putting together following the ONS website refresh.

Here’s a snapshot of where I’m at…

Firstly, asking for a summary of some popular recent figures:


The latest figures are assumed for some common keyword driven queries. We can also ask for a chart:


The ONS publish different sorts of product that can be filtered against:


So for example, we can run a search to find what bulletins are available on a particular topic:


(For some reason, the markdown isn’t being interpreted as such?)

We can then go on to ask about a particular bulletin, and get the highlights from it:


(I did wonder about numbering the items in the list, retaining the state of the previous response in the bot, and then allowing an interaction along the lines of “tell me more about item 3”?)

We can also ask about other publication types, but I haven’t checked the JSON yet to see whether it makes sense to handle the response from those slightly differently:


At the moment, it’s all a bit Wizard of Oz, but it’s amazing how fluid you can be in writing queries that are matched by some very simple regular expressions:
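To give a flavour, the matching can be as crude as a list of (pattern, action) pairs tried in turn – note that these particular patterns are illustrative guesses rather than the ones I actually used:

```python
import re

# Illustrative guesses at the sort of patterns involved
rules = [
    (re.compile(r'.*(latest|recent) (.*?) figures?', re.I), 'summary'),
    (re.compile(r'.*chart (?:of|for) (.*)', re.I), 'chart'),
    (re.compile(r'.*bulletins? (?:on|about) (.*)', re.I), 'bulletin_search'),
]

def route(msg):
    # Return the first action whose pattern matches, plus the captured topic
    for pattern, action in rules:
        m = pattern.match(msg)
        if m:
            return action, m.groups()[-1]
    return None, None

print(route('show me the latest unemployment figures'))
```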


So not bad for an hour or two’s play… Next steps would require getting a better idea about what sorts of conversation folk might want to have with the data, and what they actually expect to see in return. For example, it would be possible to mix in links to datafiles, or perhaps even upload datafiles to the slack channel?

PS Hmm, thinks.. what would a slack interface to a Jupyter server be like…?

First Steps in a Conversational Slackbot interface to CQC Inspection Data

A few months ago, I started to have a play with ratings data from the CQC – the Care Quality Commission. I don’t seem to have posted the scraper tools I ended up with anywhere, but I have started playing with them again over the last couple of weeks in the context of my slackbot explorations.

In particular, I’ve started working on a conversational workflow for keeping track of inspections in a particular local area. At the current time, the CQC website only seems to support alerts relating to new reports at the level of a particular location, although it is possible to get faceted search results relating to reports published over the course of the last week or last month. For the local journalist wanting to keep tabs on reports associated with local providers, this means setting up a data beat that includes checking the CQC website on a regular basis, firstly to see whether there are any new reports, and secondly to see what sort of reports they are.

And as a report from OnTheWight today shows (Beacon Health Centre rated ‘Good’ by CQC), this can include good news stories as well as bad news ones.


So what’s the thing I’ve been exploring in the slack context? A time saver, hopefully. In the first case, it provides a quick way of checking up on reports from the local area released over the previous week or previous month:

To begin with, we can ask for a summary report of recent inspections:


The report does a bit of counting – to provide some element of context – and provides a highlights statement regarding the overall result from each of the reports. (At the moment, I don’t sort the order in which reports are presented. There are opportunities here for prioritising which results to show. I also need to think about whether results should be provided as a single slackbot response, as is currently the case, or using a separate (and hence, “star-able”) response for each report.)

After briefly skimming over the recent results, we can tunnel down into a slightly more detailed summary of the report by passing in a location ID:


As part of the report, this returns a link to the CQC website so we can inspect the full result at source. I’ve also got a scraper that pulls the full reports from the CQC site, but at the moment it’s not returning the result to slack (I think there’s a message size limit which I’m overflowing, so I need to find what that limit is and split up the response to get round it). That said, slack is perhaps not the best place to return long reports? Maybe a better way would be to submit quoted report components into a draft WordPress blog post?

We can also pull back administrative information regarding a location from its location ID.


This report also includes information about other locations operated by the same provider. (I think I do have a report somewhere that summarises the report ratings over all the locations associated with a given provider, so we can look to see how well other establishments operated by the same provider are rated, but I haven’t wired that into the Slack bot yet.)

There are several other ways we can develop this conversation…

Company number and charity number information is available for some providers, which means it should be trivial to return company registration information and company directors information from Companies House or OpenCorporates, and perhaps even data from the Charities Commission.
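For example, a company number lookup against the OpenCorporates API might look something like this sketch (endpoint pattern from the OpenCorporates REST API; unauthenticated requests are rate limited):

```python
import json
from urllib.request import urlopen

def opencorporates_company(company_number, jurisdiction='gb'):
    # OpenCorporates publishes company records at
    # /companies/{jurisdiction}/{company_number}
    url = 'https://api.opencorporates.com/v0.4/companies/{}/{}'.format(
        jurisdiction, company_number)
    data = json.loads(urlopen(url).read().decode('utf-8'))
    return data['results']['company']
```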

Rather more scruffily, perhaps, we could use location name and postcode to try a search on the Food Standards Agency website to see if we can find food ratings data for establishments of interest.

There might also be opportunities for linking in items from local spending data to try to capture local authority spend with a particular location or provider. This would be simplified if council payments to CQC rated establishments or providers included the CQC location or provider ID, but that’s perhaps too much to ask for.

If nothing else, however, this demonstrates a casual conversational way in which a local journalist might be able to use slack as part of a local data beat to run regular, periodic checks over recent reports published by the CQC relating to local care establishments.