
Blog: Emergency Chatbot using Rasa on Jupyter Notebook/Google Colaboratory


Chatbots are typical artificial intelligence tools, widely used for commercial purposes.

With this article my goal is to explain the purpose of Emergency Chatbot, how to develop an idea into an empty bot using the Rasa Stack, with its strengths and weaknesses, and why I used a Jupyter Notebook to build it.

During the summer of 2018, two other guys and I had the idea to build a chatbot; after an initial collaboration of a few months, I continued this Rasa project alone. The idea for Emergency Chatbot was born after the bad-weather tragedies that struck Italy that summer. In these situations people need urgent help and at the same time they are in a state of panic. So first of all Emergency Chatbot would be a social tool, employed in dangerous situations and exploiting technology to help save lives in time. There could perhaps also be an opportunity to deploy Emergency Chatbot in an app sold together with an insurance policy.

We started by coding some Python scripts, but then we discovered Rasa and decided to use this framework: Rasa is an open-source NLP toolkit written in Python with a pre-built architecture that you can customize as you want. The Rasa Stack is built for short messages and is able to perform tasks, so starting from an idea you can fill in this framework with sentences that form a dataset used to train your bot, then configure the NLU model, create dialogue patterns, and build a skeleton of the dialogues.

For this project I followed the steps in Justina Petraityte’s Jupyter Notebook. I chose a Jupyter Notebook because it is helpful for sharing work between data scientists; although it cannot be used to deploy the bot in production, the notebook can be shared on the web and run in the cloud. There are several tools for this, but Google Colaboratory performs well and is my choice.

To understand how the bot works, it is necessary to know the basic flow of the conversation: the user gives an input, the agent parses this input by running an NLP task, and the agent answers the user based on the entities extracted from the conversation.

An intent maps user input to responses; it can be viewed as one dialogue turn within the conversation. Each intent defines examples of user utterances, entities, and how to respond.

Entities are keywords used to identify and extract useful data from inputs. While intents allow your bot to understand the motivation behind a particular user input, entities pick out the specific pieces of information that users mention, including the entity values linked to each entity.

At this point the question is: how does the Rasa Stack work?

The Rasa framework is split into two Python libraries: Rasa NLU and Rasa Core. The first is the natural language understanding module, used for intent classification and entity extraction; its aim is to teach the chatbot how to understand user inputs using machine learning.

With Rasa Core you teach the chatbot how to respond, by training a dialogue management model based on deep learning.

When a user input arrives, it is passed to the interpreter (Rasa NLU), which extracts intents, entities, and other structured information. The next steps are managed by Rasa Core. The tracker stores the conversation history in memory and maintains the conversation state. The policy decides what action to take at every step in the dialogue; it works together with a featurizer, which creates a vector representation of the current dialogue state given by the tracker. Finally an action is executed, sending a message to the user and receiving a tracker instance.
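This message flow can be illustrated with a small dependency-free sketch. Note that `interpret`, `Tracker`, and `select_action` below are illustrative stand-ins, not Rasa's actual API:

```python
# Illustrative sketch of the Rasa message flow:
# interpreter -> tracker -> policy -> action.

def interpret(text):
    """Toy stand-in for Rasa NLU: keyword-based intent/entity extraction."""
    if "fire" in text.lower():
        return {"intent": "emergency_fire", "entities": {"group": "Fire"}}
    return {"intent": "greet", "entities": {}}

class Tracker:
    """Stores conversation history and current slot values."""
    def __init__(self):
        self.events = []
        self.slots = {}

    def update(self, parsed):
        self.events.append(parsed)
        self.slots.update(parsed["entities"])

def select_action(tracker):
    """Toy policy: map the latest intent to a bot action."""
    intent = tracker.events[-1]["intent"]
    return {"greet": "utter_greet"}.get(intent, "action_emergency")

tracker = Tracker()
tracker.update(interpret("There is a fire in my building!"))
print(select_action(tracker))  # -> action_emergency
print(tracker.slots)           # -> {'group': 'Fire'}
```

In the real stack the policy is a trained model rather than a lookup table, but the division of responsibilities is the same.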

Let’s start the journey of Emergency Chatbot!

Look at Rasa_Emergency_Chatbot_Colab Notebook.

First of all, let’s install the Rasa libraries: I used Rasa NLU 0.12.3 and Rasa Core 0.9.6, though updated versions have since been released.

Now we are ready to start the Natural Language Understanding process, using a dataset saved in the “nlu.md” file (“##” marks the beginning of an intent).

In this dataset, user input examples are grouped by intent. In Emergency Chatbot the dataset contains the following intents:

“greet”, ”affirm”, ”emergency_air_rescue”, ”emergency_insurance_number”, ”emergency_road_help”, ”emergency_fire”, ”emergency_ambulance”, ”emergency_police”.

There are six entities: “Police”, “Road_help”, “Ambulance”, “Air_rescue”, “Fire”, “Insurance”.

Entities and entity values (synonyms) are labeled using the markdown link syntax: [entity value](entity_type).
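A fragment of what such an “nlu.md” file looks like (the example sentences below are illustrative, not copied from the project's dataset):

```md
## intent:greet
- hello
- hi there

## intent:emergency_fire
- my house is on [fire](Fire)
- I can see [flames](Fire) from the window

## intent:emergency_ambulance
- I need an [ambulance](Ambulance)
- someone is [injured](Ambulance), send help
```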

In the Rasa NLU model, incoming messages are processed by a sequence of components, executed one after another in a so-called processing pipeline. There are components for entity extraction, intent classification, pre-processing, and more. Each component processes the input and creates an output.

The processing pipeline is embedded in the “config.yml” file: the spaCy library provides tokenization and part-of-speech tagging, the spaCy featurizer looks up a GloVe vector for each token and pools them to generate a representation of the whole sentence, a scikit-learn classifier trains a support vector machine, and ner_crf trains a conditional random field to recognise the entities. Support vector machines are used because they are good classifiers for short messages, as demonstrated by the Kaggle competition “What’s Cooking?”.
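Spelled out component by component, a “config.yml” for this pipeline in the Rasa NLU 0.12 era would look roughly like this (a sketch rather than the project's exact file):

```yaml
language: "en"
pipeline:
- name: "nlp_spacy"                  # load the spaCy language model
- name: "tokenizer_spacy"            # tokenization and parts of speech
- name: "intent_featurizer_spacy"    # pooled GloVe vectors per sentence
- name: "intent_classifier_sklearn"  # SVM intent classifier
- name: "ner_crf"                    # conditional random field for entities
- name: "ner_synonyms"               # map entity synonyms to canonical values
```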

The model is evaluated with a confusion matrix, and its performance is tested on a new sentence.
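The idea behind the confusion matrix is simply counting (true intent, predicted intent) pairs; a dependency-free sketch, with made-up labels standing in for the notebook's evaluation output:

```python
from collections import Counter

def confusion_counts(y_true, y_pred):
    """Count (true_intent, predicted_intent) pairs; diagonal pairs are
    correct classifications, off-diagonal pairs are confusions."""
    return Counter(zip(y_true, y_pred))

# Hypothetical evaluation labels, for illustration only.
true_intents = ["greet", "emergency_fire", "emergency_fire", "affirm"]
pred_intents = ["greet", "emergency_fire", "emergency_ambulance", "affirm"]

matrix = confusion_counts(true_intents, pred_intents)
print(matrix[("emergency_fire", "emergency_fire")])       # -> 1 (correct)
print(matrix[("emergency_fire", "emergency_ambulance")])  # -> 1 (confused)
```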

At this point Rasa Core is used to teach the chatbot how to respond, using a deep learning model: an LSTM neural network implemented in Keras.

In this step two files are relevant: “stories.md” and “domain.yml”.

The first represents the training data for the dialogue management model: a draft of actual conversations, with intents and entities for the user and actions for the bot. In a typical story:

“##” stands for the beginning of a story;

“*” stands for the messages sent by the user in the form of intents;

“-” stands for actions taken by the bot.
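Putting these conventions together, a story in “stories.md” looks roughly like this (the story title and action names below are illustrative):

```md
## story: ask for fire brigade
* greet
  - utter_greet
* emergency_fire{"group": "Fire"}
  - action_retrieve_emergency_number
* affirm
  - utter_goodbye
```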

Actions are what your bot runs in response to user inputs. In Rasa Core there are three types of actions:

default actions (action_listen, action_restart, action_default_fallback);

utter actions, starting with “utter_”, used to send messages in response to the user input;

custom actions (any other action), which can run arbitrary code.

In Emergency Chatbot, a custom action is created through an API that allows the chatbot to retrieve an answer for the ambulance, police, or fire entity depending on the user input. The chatbot knows which type of answer to give by retrieving the value from the slot ‘group’.
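The slot-dispatch logic of such a custom action can be sketched without dependencies. In Rasa Core 0.9 the real class would subclass `rasa_core.actions.Action` and use a dispatcher to send the message; the class name, answers, and the (real Italian) emergency numbers here are illustrative:

```python
# Dependency-free sketch of a custom action's slot-dispatch logic.
# Illustrative answers; 115/113/118 are Italy's emergency numbers.
EMERGENCY_NUMBERS = {
    "Fire": "call the fire brigade at 115",
    "Police": "call the police at 113",
    "Ambulance": "call an ambulance at 118",
}

class ActionRetrieveEmergencyNumber:
    def name(self):
        # The action name referenced from stories.md and domain.yml.
        return "action_retrieve_emergency_number"

    def run(self, slots):
        """Pick the answer based on the value stored in the 'group' slot."""
        group = slots.get("group")
        return EMERGENCY_NUMBERS.get(group, "please describe your emergency")

action = ActionRetrieveEmergencyNumber()
print(action.run({"group": "Fire"}))  # -> call the fire brigade at 115
```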

The “domain.yml” file represents the skeleton of the conversation: it defines the expected user inputs, the actions that should be predicted by the bot, how to respond, and what information to store.

The domain is composed of five parts: intents, slots, entities, actions, and templates.

Templates contain the messages linked to the bot’s actions.

Slots represent the bot’s memory: they store key–value pairs gathered from the user or from the outside world. In Emergency Chatbot, slots are set by returning events in actions; they are categorical slots represented by entities with their values.
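Assembled from the pieces above, a “domain.yml” for this kind of bot would look roughly as follows (a trimmed, illustrative sketch, not the project's full file):

```yaml
intents:
  - greet
  - affirm
  - emergency_fire
  - emergency_ambulance
  - emergency_police

entities:
  - Fire
  - Ambulance
  - Police

slots:
  group:
    type: categorical
    values:
    - Fire
    - Ambulance
    - Police

actions:
  - utter_greet
  - utter_goodbye
  - action_retrieve_emergency_number

templates:
  utter_greet:
    - "Hello! How can I help you?"
  utter_goodbye:
    - "Stay safe, help is on the way."
```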

Now the dialogue management model is trained by creating an agent and including policies that decide which action to take at every step in the conversation.

The model used to train on the stories is a Long Short-Term Memory (LSTM) network, a variant of recurrent neural networks that extends their memory. RNNs were the first algorithms able to remember their inputs, thanks to an internal memory, which makes them well suited to problems with sequential data; they are used by Apple’s Siri and Google’s Voice Search.

After 10–15 minutes of training (depending on whether a GPU is used), you can play with Emergency Chatbot.

Although Rasa has extensive documentation, I found it quite difficult to find documentation for Rasa NLU about the implementation of support vector machines. Another weakness concerns the fallback action, used when the bot does not understand the user input: the bot should send a default template message to the user and revert to the state of the conversation before the message that caused the fallback.

I did not use this type of action because it does not work properly: the bot keeps suggesting the wrong answer. The same happens with “Sara”, the Rasa bot used on their website.

Despite these weaknesses, I consider Rasa extremely versatile, with the opportunity to deploy it into production with many tools, and it is in continuous evolution.

A special thanks to Lorenzo Invernizzi and Frenk Locmelis for their initial contributions to this project.

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

My Github repository: https://github.com/claudio1975/chatbot

References:

https://github.com/RasaHQ/rasa-workshop-pydata-berlin

https://arxiv.org/abs/1712.05181

https://rasa.com/docs/

Source: Artificial Intelligence on Medium
