Linux Format

Build a Slackbot

Chatbots are everywhere. Amazon Web Services’ Lex makes it easy to build and deploy bots to Slack, Facebook and more, as Dan Frost demonstrates


Dan Frost is back wielding his Serverless Framework like an immortal Highlander, unleashed and creating Slackbots to torment and to titillate your online enemies.

Bots have a varied presence online, from Facebook Messenger to websites, the Quartz news app and on Slack. Done well, they’re helpful, can save you time and glue together existing data and infrastructure.

However, building a bot can be fiddly. Thankfully, Amazon Web Services (AWS) released Lex, the natural language processing engine and bot system behind Alexa. It enables you to build and deploy a bot in very few lines of code. In fact, once you have Lex wired up to your chat platform, iterating and adding features is extremely easy.

To get you started building interactive chatbots we’re going to put together a really simple recipe-lookup bot called chefbot. We’ll build it on AWS Lex, with Slack as the chat platform and the Serverless framework handling deployment.

Chefbot has the answers

All the code is on GitHub so you can iterate on it. Chefbot is both simple and generic enough to be a good starting point for any bot that asks humans questions and looks up data from a MySQL (or any other) database.

For this recipe you’ll need an AWS account, a Slack account, a MySQL server running somewhere and the Serverless framework (see LXF228) installed. We assume that you have admin access to the AWS and Slack accounts.

Before we dive into the code, let’s understand what we’re about to build. In user interface terms, a bot is a program that sits on a messaging platform (Slack, Facebook Messenger) and interacts with humans in a conversational style using text, images and other media. All the interaction is chronological and linear, unlike apps and web pages where the interaction is directed by the user.

In system architecture terms, a bot is a piece of code that either responds to a message command or pushes a message to a human user in the hope of a follow-up command. In our architecture here, a Python function will be called each time the user sends a message in Slack. This means that we need to create an API endpoint, configure Slack to know about that endpoint and then code the endpoint to respond usefully to messages from users.

But if we’re doing anything more complex than responding to literal, perfectly matching strings then we’ll have to build a whole stack of natural language processing using machine learning. This is non-trivial, so it’s nice to offload the work onto Lex, which does this for us.

Building the Lex bot

Navigate to https://console.aws.amazon.com/lex/home?region=us-east-1#bots and create a bot, choosing Custom Bot. Enter the bot name as chefbot, select None for the voice, 5 minutes for the session timeout, select No for COPPA, then click Create.

Now we create an intent, which is something the user wants to get out of the bot. This might be a holiday, the answer to a question, the weather tomorrow or anything else that you can answer. For our example, it’s going to be a recipe.

The intent might require a few more details, which the bot gets out of the user by asking questions such as “What is the main ingredient?” Each of these extra details fills what Lex calls “slots”. Once the intent is clear and the slots are filled, Lex passes both to our function, which should then be able to provide a final response to the user.

First, create a few sample utterances which reflect how our users might express their need for a recipe:
Find me a recipe for fish
I want to cook with fish
Fish recipe

Add each of these by typing the sentence into the utterance input and clicking the + icon.

Next we need to describe the two slots of information that we need. We’ll create a dummy slot type called Ingredient with an example value Fish. Now add a slot called main_ingredient of type Ingredient and make it required. Then create another slot called cooking_time of type AMAZON.NUMBER and also make it required.

Now, returning to our sample utterances, we need to label which words in each sentence relate to our slots, since all we’re really interested in is getting the slot values. Label fish as the ingredient and the numbers as cooking_time.
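
It helps to know roughly what Lex will eventually hand to our fulfilment code once those slots are filled. The sketch below is a trimmed version of the Lex (V1) Lambda input event; the intent name FindRecipe is just a hypothetical label and a real event carries more fields, so treat it as a guide rather than gospel:

# Roughly the event Lex passes to a fulfilment Lambda once the slots are filled
event = {
    'bot': {'name': 'chefbot', 'alias': 'Beta', 'version': '1'},
    'userId': 'a-slack-user-id',
    'inputTranscript': 'Got a recipe for fish?',
    'invocationSource': 'FulfillmentCodeHook',
    'sessionAttributes': {},
    'currentIntent': {
        'name': 'FindRecipe',   # hypothetical intent name
        'slots': {'main_ingredient': 'fish', 'cooking_time': '30'},
        'confirmationStatus': 'None'
    }
}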

(It’s possible to use NLP to turn human duration phrases into numbers, but that’s outside the scope of this tutorial. Have a look at AMAZON.DURATION and play around!)
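If you do experiment with AMAZON.DURATION, the value typically arrives as an ISO-8601 duration string such as PT30M rather than a plain number. Here’s a rough, hand-rolled sketch for converting such strings into minutes, assuming only days, hours and minutes ever appear:

import re

def duration_to_minutes(duration):
    # Convert an ISO-8601 duration such as 'PT1H30M' or 'P2D' into minutes.
    match = re.fullmatch(
        r'P(?:(?P<days>\d+)D)?(?:T(?:(?P<hours>\d+)H)?(?:(?P<mins>\d+)M)?)?',
        duration)
    if not match:
        raise ValueError('Unrecognised duration: {}'.format(duration))
    days = int(match.group('days') or 0)
    hours = int(match.group('hours') or 0)
    mins = int(match.group('mins') or 0)
    return (days * 24 + hours) * 60 + mins

print(duration_to_minutes('PT1H30M'))  # 90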

We aren’t going to bother with a confirmation prompt since what we do with the intent isn’t exactly life-changing: we aren’t ordering a pizza, taking down a server or moving money between bank accounts. In those more serious situations you would have a confirmation like “Okay – I’m going to push the big red button. Are you sure about that?” before doing it.

For now, leave fulfilment as Return Parameters to Client, which does what it sounds like. We get the slots back so we can see if our little conversati­on worked.

To test your bot, first click Build to build it and then open the Test Bot dialog in the bottom right of the console. Type one of the sample utterances or a slight variation such as “Got a recipe for fish?” You should see that the slight variations in language are dealt with by the Lex NLP, and after answering a couple of questions you get the slot values back.

Our next task is to plumb in Slack so the same result can be seen there. After that, we’ll do something more exciting with the data.

Plumb in Slack

Getting Lex tied into Slack requires copying a few keys from Lex to Slack and vice versa. This is donkey work, but necessary, so let’s get on with it…

In the Settings tab create a new alias by giving it a name – for example Beta – and selecting a version. Select the most recent version you’ve built. Click the Channels tab and then click Slack.

You’ll need some information from Slack first, so open a new tab, go to https://api.slack.com and log in. Click Add a Bot and then on the next screen click Add a Bot User. On the next screen set the bot name to chefbot, set Always Show My Bot as Online to On and click Add Bot User.

Click Interactive Messages in the left menu and then click Enable Interactive Messages. For now, just put any valid URL in the URL field – we’ll come back to this later. Now click Basic Information in the left menu and copy the values for Client ID, Client Secret and Verification Token from the Slack interface into the corresponding fields in Lex, then click Activate in Lex.

We now have two values to copy back to Slack: the Postback URL, which is used for event subscriptions and interactive messages; and the OAuth URL, which is used for the OAuth handshake to authenticate with Slack. Copy these into a text file as we’ll need them in a few places.

Go back to the Slack tab (we’re nearly done, we promise) and click OAuth & Permissions. Click Add Redirect URL and paste in the OAuth URL. Click Save. Now add the scope permissions `chat:write:bot` and `team:read` and save changes. Click Interactive Messages and copy the Postback URL from Lex into the Request URL field and click Save changes.

Finally (yes, really…), click Event Subscriptions and enable them with the toggle. Paste the Postback URL into the Request URL field. In Subscribe to Team Events add message.channels, and in Subscribe to Bot Events type message.im and select the option that comes up. Save the changes.

So that’s the config done. Now we need to deploy it to Slack. In Manage Distributions click Add to Slack and then Authorize on the following screen. You’re then redirected to the Slack web UI where you can test the bot. Select the bot from the list of channels on the left and start chatting using the utterances that you configured earlier.

(Before we go any further, if you get stuck or if the process has changed, consult the AWS documentation on integrating Slack: http://docs.aws.amazon.com/lex/latest/dg/slackbot-association.html.)

Okay, this took some boring configuration, but the upshot is that you can easily message the bot and obtain a response. We now have Slack sending messages to Lex, and Lex working out what we need from the user and then dumping the slots of data back to the user. The only uncool part is that nothing interesting is being done with that data yet, so let’s solve that next.

Bring our Frankenbot to life

Instead of just dumping the values back to the user, Lex can hand off to Lambda, AWS’s serverless environment. I’m going to use the Serverless framework, which does lots of the complex orchestration required to use AWS, so get yourself an AWS account, install the Serverless framework and let’s bootstrap the project. We’re going to use the Python 3 environment as that’s our preferred language these days. Let’s kick things off:

npm install -g serverless
serverless create --template aws-python --path MyChatBot

Now create an IAM profile for yourself and set up your credentials as follows:

serverless config credentials -p aws -k XXX -s XXXXX --profile tutorial-profile

Now modify the serverless.yml file to contain the following. You can remove all the boilerplate config if you wish:

provider:
  name: aws
  runtime: python3.6
  profile: tutorial-profile
…
functions:
  handle_lookup:
    handler: handler.handle_lookup
    events:
      - http:
          path: lookup
          method: any

Now add the handler function to handler.py:

import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def handle_lookup(event, context):
    logger.info(str(event))
    return {
        'sessionAttributes': event['sessionAttributes'],
        'dialogAction': {
            'type': 'Close',
            'fulfillmentState': 'Fulfilled',
            'message': {
                'contentType': 'PlainText',
                'content': 'Look at my bot!'
            }
        }
    }
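
Before deploying, it’s worth a quick sanity check that the handler returns the right shape. Here’s a minimal local test under a couple of assumptions: the fake event below contains only the fields our handler touches, plus a hypothetical intent name, whereas a real Lex event carries many more fields.

# quick_test.py – hypothetical local smoke test for handler.py
from handler import handle_lookup

fake_event = {
    'sessionAttributes': {},
    'currentIntent': {
        'name': 'FindRecipe',   # hypothetical intent name
        'slots': {'main_ingredient': 'fish', 'cooking_time': '30'}
    }
}

# Lex would supply a real context object; our handler ignores it, so None is fine.
print(handle_lookup(fake_event, None))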

And deploy:

serverless deploy -v

(At this stage it’s worth tailing the log in your terminal: serverless logs -f handle_lookup -t.)

In the Lex UI, change the fulfilment to AWS Lambda function and select your function from the dropdown. Save the intent. Give it a whirl in Slack and you should see the few slot-filling steps performed by Lex and then the final “Look at my bot!” response from our Python method.

Let’s finish off by making this do something interestin­g.

Pulling in some real data

We’ve created a simple MySQL dataset that you can put into any MySQL database instance. For the purposes of this tutorial I created an AWS RDS instance, but your MySQL server can be anywhere so long as Lambda can see it. There aren’t enough words in this article to cover MySQL setup as well as the chat side of things, so just do what works for you (we love a challenge!–Ed). All you need is the hostname, user, password and database name, and to have the DB open to the internet… (Warning: opening the database to the internet like this is not fit for production systems. It’s for demonstration only.)

Install the MySQL connector and then write the code to connect, select the records and return them. First install the package:

virtualenv .myenv
source .myenv/bin/activate
pip install mysql-connector-python-rf

Now add the following to the top of the handler file:

import sys
sys.path.append('.myenv/lib/python3.6/site-packages')
import mysql.connector

The sys.path line is needed because we bundle all the dependencies inside the Lambda package, so we have to add the virtualenv’s site-packages directory to our path; the import itself is then a normal Python import statement.
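
If you want some data to test against, here’s a hypothetical seed script. The article doesn’t spell out the schema, so the table and column layout below is just one guess that matches what the handler in the next listing expects (the recipe name in the second column, an ingredient list in the fourth, plus main_ingredient and cooking_time to filter on):

# seed_recipes.py – hypothetical schema and sample data; adjust to taste
import mysql.connector

cnx = mysql.connector.connect(user='...', password='...',
                              host='...', database='...')
cursor = cnx.cursor()

cursor.execute("""
CREATE TABLE IF NOT EXISTS recipes (
    id INT AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(255),             -- r[1] in the handler
    main_ingredient VARCHAR(64),
    ingredients TEXT,              -- r[3] in the handler
    cooking_time INT               -- minutes
)""")

cursor.executemany(
    "INSERT INTO recipes (name, main_ingredient, ingredients, cooking_time) "
    "VALUES (%s, %s, %s, %s)",
    [('Fish finger surprise', 'fish', 'fish fingers, ketchup, bread', 10),
     ('Slow-baked trout', 'fish', 'trout, lemon, butter', 45)])

cnx.commit()
cursor.close()
cnx.close()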

Now we can get down to the task of writing the code to connect and return results. For this example we’re using the plain MySQL connector, but you can employ whichever database connector takes your fancy.

def handle_lookup(event, context):
    logger.info(str(event))

    hostname = '...'
    username = '...'
    password = '...'
    database = '...'

    # Pull the slot values Lex has collected for us
    main_ingredient = event['currentIntent']['slots']['main_ingredient']
    cooking_time = event['currentIntent']['slots']['cooking_time']

    cnx = mysql.connector.connect(user=username, password=password,
                                  host=hostname, database=database)
    cursor = cnx.cursor(buffered=True)

    # Parameterised query - the connector handles quoting for us
    query = ('select * from recipes '
             'where main_ingredient = %s and cooking_time <= %s')
    cursor.execute(query, (main_ingredient, cooking_time))

    reply = "\n*Here's what I found:*\n"
    for r in cursor:
        logger.info(r)
        recipe = "\n- *{}* which requires the ingredients: {}".format(r[1], r[3])
        reply = reply + recipe

    return {
        'sessionAttributes': event['sessionAttributes'],
        'dialogAction': {
            'type': 'Close',
            'fulfillmentState': 'Fulfilled',
            'message': {
                'contentType': 'PlainText',
                'content': "Here's a recipe for " + main_ingredient +
                           ' taking ' + cooking_time + "\n\n" + reply
            }
        }
    }

Now let’s take it for a spin. We’ve loaded up the recipe database with a few entirely nonsense recipes, but the results should give you an idea of what you can do with bots. Start asking chefbot for recipes and you’ll be asked for a main ingredient and cooking time, and get recipes in response. The only piece that’s coded is the final query to the database and the response to the user. This is simple, but there’s a lot of potential.

Where to go from here

In this article we’ve configured a natural language processor, hooked up a Lambda function to react to users’ utterances and wired in a MySQL database of recipes via a Serverless-deployed Lambda function. All of this is far more configuration than programming, so it’s important to keep in mind what can be achieved if you take this further.

The deployment we ran had the interaction take place in private human-bot channels, but bots can sit in on public channels as well. This can be useful if you’re pulling in data for your team to reference, such as “@issuetrackerbot how many open issues are there?” or “@uptime how much downtime was there on server X last week?”.

The interaction we built was also entirely text based, with a text input and text output. This is nice for a proof of concept, but you can also add card responses, which make the process more visual and, on a platform like Facebook Messenger, much more engaging. Exploring places such as Google Docs, Dropbox, traffic monitoring, billing, social and other data sources can broaden the scope for what just a couple of questions to a bot can do for you and your users.
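
As a taste of that, a Lex fulfilment response can include a response card alongside the plain-text message, which the chat platform renders as a richer attachment. Here’s a rough sketch of a helper that builds such a response; the field names follow the Lex V1 response-card format as we understand it, and the button text and values are made up for illustration, so check the docs before leaning on this:

def close_with_card(session_attributes, text, title, subtitle, image_url=None):
    # Build a Lex 'Close' fulfilment response that carries a generic response
    # card as well as plain text (Lex V1 response-card format; verify fields).
    return {
        'sessionAttributes': session_attributes,
        'dialogAction': {
            'type': 'Close',
            'fulfillmentState': 'Fulfilled',
            'message': {'contentType': 'PlainText', 'content': text},
            'responseCard': {
                'version': 1,
                'contentType': 'application/vnd.amazonaws.card.generic',
                'genericAttachments': [{
                    'title': title,
                    'subTitle': subtitle,
                    'imageUrl': image_url,
                    'buttons': [
                        {'text': 'Another fish recipe', 'value': 'Fish recipe'}
                    ]
                }]
            }
        }
    }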

Chefbot in action. Simple additions like emoji make a dry bot chat seem fun or more human to deal with.
It’s important for people to know what your bot does. Creating a simple icon, adding a colour and a compelling short description will get users chatting.
The AWS Lex console, where you design your chatbot’s conversations. Play around with new phrasings and forms of chat to keep the experience engaging.
Dan Frost
Dan works at Cambridge Assessment, experimenting with technology in education. He’s a writer and explorer of ideas and technologies. He’s often found on Twitter (@danfrost) dismissing new fads in computing. Until he champions them.
Experiment with events – we’ve only scratched the surface of the events you can use here, but it’s possible to have your Lambda function react to other data. Play around in the Slack config and see what else you can create.
