Build a Slackbot
Chatbots are everywhere. Amazon Web Services’ Lex makes it easy to build and deploy bots to Slack, Facebook and more, as Dan Frost demonstrates
Dan Frost is back wielding his Serverless Framework like an immortal Highlander, unleashed and creating Slackbots to torment and to titillate your online enemies.
Bots have a varied presence online, from Facebook Messenger to websites, the Quartz news app and Slack. Done well, they're helpful: they can save you time and glue together existing data and infrastructure.
However, building a bot can be fiddly. Thankfully, Amazon Web Services (AWS) released Lex, the natural language processing engine and bot system behind Alexa. It enables you to build and deploy a bot in very few lines of code. In fact, once you have Lex wired up to your chat platform, iterating and adding features is extremely easy.
To get you started building interactive chatbots, we're going to put together a really simple recipe-lookup bot called chefbot. We're going to build it on AWS Lex, with Slack as the chat platform and the Serverless framework for deployment.
Chefbot has the answers
All the code is on GitHub so you can iterate on it. Chefbot is both simple and generic enough to be a good starting point for any bot that asks humans questions and looks up data from a MySQL (or any other) database.
For this recipe you’ll need an AWS account, a Slack account, a MySQL server running somewhere and the Serverless framework (see LXF228) installed. We assume that you have admin access to the AWS and Slack accounts.
Before we dive into the code, let's understand what we're about to build. In user interface terms a bot is a computer program that sits on a messenger platform (Slack, Facebook Messenger) and interacts with humans in a conversational style using text, images and other media. All the interaction is chronological and linear, unlike apps and web pages, where the interaction is directed by the user.
In system architecture terms, a bot is a piece of code that either responds to a message command or pushes a message to a human user in the hope of a follow-up command. In terms of our architecture here, a Python function will be called each time the user sends a message in Slack. This means that we need to create an API endpoint, configure Slack to know about that endpoint and then code the endpoint to respond usefully to messages from users.
But if we’re doing anything more complex than responding to literal, perfectly matching strings then we’ll have to build a whole stack of natural language processing using machine learning. This is non-trivial, so it’s nice to offload the work onto Lex, which does this for us.
Building the Lex bot
Navigate to https://console.aws.amazon.com/lex/home?region=us-east-1#bots and create a bot, choosing Custom Bot. Enter the bot name as chefbot, select None for voice, 5 for timeout, select No for COPPA, then click Create.
Now we create an intent, which is something the user wants to get out of the bot. This might be a holiday, the answer to a question, the weather tomorrow or anything else that you can answer. For our example, it's going to be a recipe.
The intent might require a few more details, which the bot gets out of the user by asking questions such as "What is the main ingredient?" Each of these extra details fills what Lex calls "slots". Once the intent is clear and the slots are filled, Lex will pass both to our function, which should then be able to provide a final response to the user.
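When Lex later hands the intent over to our function, the intent name and the filled slots arrive together in a single event structure. As a rough sketch of what that looks like (the field names follow the Lex V1 Lambda input format, but the intent name and slot values here are invented):

```python
# A trimmed example of the event Lex passes to a fulfilment function.
# Structure follows the Lex (V1) Lambda input format; values are made up.
event = {
    'currentIntent': {
        'name': 'FindRecipe',          # hypothetical intent name
        'slots': {
            'main_ingredient': 'fish',
            'cooking_time': '30',      # slot values arrive as strings
        },
    },
    'sessionAttributes': {},
}

# Pulling the filled slots out is just dictionary access:
slots = event['currentIntent']['slots']
print(slots['main_ingredient'], slots['cooking_time'])
```

Everything our handler needs later in the tutorial comes out of this one dictionary.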
First, create a few sample utterances which reflect how our users might express their need for a recipe:

Find me a recipe for fish
I want to cook with fish
Fish recipe
Add each of these by typing the sentence into the utterance input and clicking the + icon.
Next we need to describe the two slots of information that we need. We'll create a dummy slot type called Ingredient with an example value Fish. Now add a slot called main_ingredient of type Ingredient and make it required. Then create another slot called cooking_time of type AMAZON.NUMBER and also make it required. Now, returning to our sample utterances, we need to label which of the words in the sentences relate to our slots, since all we're really interested in is getting the slot values. Label fish as an ingredient and the numbers as cooking_time.
(It’s possible to use NLP to turn human duration phrases into numbers, but that’s outside the scope of this tutorial. Have a look at AMAZON.DURATION and play around!)
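If you do experiment with AMAZON.DURATION, note that it fills the slot with an ISO 8601 duration string such as PT30M or PT1H30M. A minimal sketch of turning such a value into minutes might look like this (it only handles the hour and minute components, which is plenty for cooking times, and assumes well-formed input):

```python
import re

def duration_to_minutes(iso_duration):
    """Convert a simple ISO 8601 time duration (e.g. 'PT1H30M') to minutes.

    Only the hour and minute components after the 'T' are considered;
    days, seconds and so on are ignored for this sketch.
    """
    hours = re.search(r'(\d+)H', iso_duration)
    minutes = re.search(r'(\d+)M', iso_duration)
    total = 0
    if hours:
        total += int(hours.group(1)) * 60
    if minutes:
        total += int(minutes.group(1))
    return total

print(duration_to_minutes('PT30M'))    # 30
print(duration_to_minutes('PT1H30M'))  # 90
```

With that in place the cooking_time slot could accept phrases like "half an hour" rather than a bare number.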
We aren’t going to bother with a confirmation prompt since what we do with the intent isn’t exactly life-changing: we aren’t ordering a pizza, taking down a server or moving money between bank accounts. In those more serious situations you would have a confirmation like “Okay – I’m going to push the big red button. Are you sure about that?” before doing it.
For now, leave fulfilment as Return Parameters to Client, which does what it sounds like. We get the slots back so we can see if our little conversation worked.
To test your bot, first click Build to build it and then open the Test Bot dialog in the bottom right of the console. Type one of the sample utterances or a slight variation such as “Got a recipe for fish?” You should see that the slight variations in language are dealt with by the Lex NLP, and after answering a couple of questions you get the slot values back.
Our next task is to plumb in Slack so the same result can be seen there. After that, we’ll do something more exciting with the data.
Plumb in Slack
Getting Lex tied into Slack requires copying a few keys from Lex to Slack and vice versa. This is donkey work, but necessary so let’s get on with it…
In the Settings tab create a new alias by giving it a name – for example Beta – and selecting a version. Select the most recent version you’ve built. Click the Channels tab and then click Slack.
You'll need some information from Slack first, so open a new tab, go to https://api.slack.com and log in. Click Add a Bot and then on the next screen click Add a Bot User. On the next screen set the bot name to chefbot, set Always Show my Bot as Online to On and click Add Bot User.
Click Interactive Messages in the left menu and then click Enable Interactive Messages. For now, just put any valid URL in the URL field – we'll come back to this later. Now click Basic Information in the left menu and copy the values for Client ID, Client Secret and Verification Token from the Slack interface into the corresponding fields in Lex and, in Lex, click Activate.
We now have two values to copy back to Slack: the Postback URL, which is used for event subscriptions and interactive messages; and the OAuth URL, which is used for the OAuth handshake to authenticate with Slack. Copy these into a text file as we'll need them in a few places.
Go back to the Slack tab (we're nearly done, we promise) and click OAuth & Permissions. Click Add Redirect URL, paste in the OAuth URL and click Save. Now add the scope permissions `chat:write:bot` and `team:read`, and save the changes. Click Interactive Messages, copy the Postback URL from Lex into the Request URL field and click Save changes.
Finally (yes, really…), click Event Subscriptions and enable them with the toggle. Paste the Postback URL into the Request URL field. In Subscribe to Team Events add message.channels, and in Subscribe to Bot Events type message.im and select the option that comes up. Save the changes.
So that's the config done. Now we need to deploy it to Slack. In Manage Distribution click Add to Slack and then Authorize on the following screen. You're then redirected to the Slack web UI where you can test the bot. Select the bot from the left-hand list of channels and start chatting using the utterances that you configured earlier.
(Before we go any further, if you get stuck or if the process has changed, consult the AWS documentation on integrating Slack, http://docs.aws.amazon.com/lex/latest/dg/slackbot-association.html.)
Okay, this took some boring configuration but the upshot is that you can easily message the bot and obtain a response. We now have Slack sending messages to Lex, Lex working out what we need from the user and then dumping the slots of data back to the user. The only uncool part of this is that the Lambda function isn’t doing anything very interesting, so let’s solve that next.
Bring our Frankenbot to life
Instead of just dumping the values back to the user, Lex can hand off to Lambda, AWS's Serverless environment. I'm going to use the Serverless framework, which does lots of the complex orchestration required to use AWS, so get yourself an AWS account and install the Serverless framework and let's bootstrap the project. We're going to use the Python 3 environment as that's our preferred language these days. Let's kick things off:

npm install -g serverless
serverless create --template aws-python --path MyChatBot

Now create an IAM profile for yourself and set up your credentials as follows:

serverless config credentials -p aws -k XXX -s XXXXX --profile tutorial-profile

Now modify the serverless.yml file to contain the following. You can remove all the boilerplate config if you wish:

provider:
  name: aws
  runtime: python3.6
  profile: tutorial-profile
…
functions:
  handle_lookup:
    handler: handler.handle_lookup
    events:
      - http:
          path: lookup
          method: any

Now add the handler function to handler.py:

import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def handle_lookup(event, context):
    logger.info(str(event))
    return {
        'sessionAttributes': event['sessionAttributes'],
        'dialogAction': {
            'type': 'Close',
            'fulfillmentState': 'Fulfilled',
            'message': {
                'contentType': 'PlainText',
                'content': 'Look at my bot!'
            }
        }
    }
And deploy:

serverless deploy -v

(At this stage it's worth tailing the log in your terminal: serverless logs -f handle_lookup -t.)
In the Lex UI, change fulfilment to AWS Lambda Function and select your function from the dropdown. Save the intent. Give it a whirl in Slack and you should see the slot-filling steps performed by Lex, followed by the final 'Look at my bot!' response from our Python method.
Let’s finish off by making this do something interesting.
Pulling in some real data
We've created a simple MySQL dataset that you can put into any MySQL database instance. For the purposes of this, I created an AWS RDS instance, but your MySQL server can be anywhere so long as Lambda can see it. There aren't enough words in the article to get MySQL set up and do the chat stuff, so just do what works for you (we love a challenge! – Ed). All you need is the hostname, user, password and database name, and to have the DB open to the internet… (Warning: opening the database to the internet is not fit for production systems. This is for demonstration only.)
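One way to supply those connection details to the Lambda without baking them into the code is the environment section of serverless.yml, which the Serverless framework passes through as environment variables. A sketch (the hostname and variable names below are our own invention):

```yaml
provider:
  name: aws
  runtime: python3.6
  profile: tutorial-profile
  environment:
    DB_HOST: my-instance.rds.amazonaws.com   # hypothetical values
    DB_USER: chefbot
    DB_NAME: recipes
```

The handler can then read them with os.environ['DB_HOST'] and so on. The password itself is better referenced from your shell environment with the framework's ${env:DB_PASSWORD} syntax, so it never lands in version control.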
Install the MySQL connector and then write the code to connect, select the records and return them. First install the package:

virtualenv .myenv
source .myenv/bin/activate
pip install mysql-connector-python-rf

Now add the following to the top of the handler file:

import sys
sys.path.append('.myenv/lib/python3.6/site-packages')
import mysql.connector
The sys.path line is needed because we bundle all the dependencies inside the Lambda package, so the site-packages directory has to be added to our path; the mysql.connector line is then a normal Python import statement.
Now we can get down to the task of writing the code to connect and return results. For this example we're using the plain MySQL connector, but you can employ whichever database connector takes your fancy:

def handle_lookup(event, context):
    logger.info(str(event))

    hostname = '...'
    username = '...'
    password = '...'
    database = '...'

    main_ingredient = event['currentIntent']['slots']['main_ingredient']
    cooking_time = event['currentIntent']['slots']['cooking_time']

    cnx = mysql.connector.connect(user=username, password=password,
                                  host=hostname, database=database)
    cursor = cnx.cursor(buffered=True)
    query = ('select * from recipes '
             'where main_ingredient = %s and cooking_time <= %s')
    cursor.execute(query, (main_ingredient, cooking_time))

    reply = "\n*Here's what I found:*\n"
    for r in cursor:
        logger.info(r)
        recipe = "\n- *{}* which requires the ingredients: {}".format(r[1], r[3])
        reply = reply + recipe

    return {
        'sessionAttributes': event['sessionAttributes'],
        'dialogAction': {
            'type': 'Close',
            'fulfillmentState': 'Fulfilled',
            'message': {
                'contentType': 'PlainText',
                'content': "Here's a recipe for " + main_ingredient +
                           ' taking ' + cooking_time + '\n\n' + reply
            }
        }
    }
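The Close dialogAction above ends the conversation, but the Lambda can also steer it mid-flow. If a slot came back empty or nonsensical, you can hand a question straight back to Lex with an ElicitSlot action instead. A sketch following the Lex (V1) response format, with a prompt of our own invention:

```python
def elicit_slot(event, slot_to_elicit, message):
    """Ask the user for a specific missing slot instead of closing.

    Follows the Lex (V1) Lambda response format: Lex re-prompts the
    user and calls us again once the slot is filled.
    """
    return {
        'sessionAttributes': event.get('sessionAttributes') or {},
        'dialogAction': {
            'type': 'ElicitSlot',
            'intentName': event['currentIntent']['name'],
            'slots': event['currentIntent']['slots'],
            'slotToElicit': slot_to_elicit,
            'message': {
                'contentType': 'PlainText',
                'content': message,
            },
        },
    }

# Example: re-ask for the cooking time if it wasn't understood.
event = {'sessionAttributes': {},
         'currentIntent': {'name': 'FindRecipe',
                           'slots': {'main_ingredient': 'fish',
                                     'cooking_time': None}}}
response = elicit_slot(event, 'cooking_time',
                       'How many minutes do you have to cook?')
print(response['dialogAction']['type'])  # ElicitSlot
```

This is how you'd add validation, for instance rejecting a cooking time of zero before hitting the database.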
Now let’s take it for a spin. We’ve loaded up the recipe database with a few entirely nonsense recipes, but the results should give you an idea of what you can do with bots. Start asking chefbot for recipes and you’ll be asked for a main ingredient and cooking time, and get recipes in response. The only piece that’s coded is the final query to the database and the response to the user. This is simple, but there’s a lot of potential.
Where to go from here
In this article we've configured a natural language processor, hooked up a Lambda function to react to users' utterances and pulled in a MySQL database of recipes via a Serverless Lambda function. All this is far more configuration than programming, so it's important to keep in mind what can be achieved if you take this further.
The deployment we ran had the interaction take place in private human-bot channels, but bots can sit in on public channels as well. This can be useful if you're pulling in data for your team to reference, such as "@issuetrackerbot how many open issues are there?" or "@uptimebot how much downtime on server X last week?"
The interaction we built was also entirely text-based, with a text input and text output. This is nice for a proof of concept, but you can also add card responses, which make the process more visual and, on a platform like Facebook Messenger, much more engaging. Exploring places such as Google Docs, Dropbox, traffic monitoring, billing, social and other data sources can broaden the scope of what just a couple of questions to a bot can do for you and your users.
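As a taste of those card responses, Lex (V1) lets the Lambda attach a responseCard to its reply; platforms that support them render an image-and-buttons card rather than plain text. A sketch, with titles and button values of our own invention:

```python
def recipe_card(title, subtitle):
    """Build a Lex (V1) responseCard dictionary.

    Attach the result alongside 'message' inside 'dialogAction';
    supporting platforms render it as a rich card with buttons.
    """
    return {
        'version': 1,
        'contentType': 'application/vnd.amazonaws.card.generic',
        'genericAttachments': [{
            'title': title,
            'subTitle': subtitle,
            'buttons': [
                # Button 'value' is sent back as the user's next utterance.
                {'text': 'Show me another', 'value': 'another recipe'},
                {'text': "That's the one", 'value': 'choose this'},
            ],
        }],
    }

card = recipe_card('Fish surprise', 'Ready in 30 minutes')
print(card['contentType'])
```

Clicking a button feeds its value back into the conversation, so the same intents and slots we built above keep working without any extra parsing.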