
AI Language Understanding for Real-World Customer Care

Nowadays, artificial intelligence (AI) can add a layer of intelligence to various apps and services, such as chatbots. Microsoft’s vision of the intelligent cloud has led to more and more intelligence-as-a-service offerings within the Microsoft Azure ecosystem. Speaking about the Microsoft AI platform, its services can be grouped into three main categories:

  • AI Services
  • AI Infrastructure
  • AI Tools

If you want to add pre-built AI models to your application or service, Microsoft’s Cognitive Services, which make up the AI Services category, are most probably the way to go. There are currently more than 20 different Cognitive Services APIs available for various use cases.

One of those use cases is understanding human language and deriving actions from that understanding. For this, Microsoft offers a service called Language Understanding, formerly known as LUIS (Language Understanding Intelligent Service), which can be used to build natural language understanding into applications and services. The main concept is to build a language model which can be trained to match your use case, comes equipped with pre-defined entity dictionaries, and currently supports 12 languages. One example use case would be to use LUIS to analyse replies coming from customers in a customer-facing service bot scenario. There, LUIS helps analyse the requests and messages sent by customers to distinguish between a given set of actions, like:

  • Promising to pay by a given date
  • Asking for a copy of the original invoices
  • Asking for the proof of delivery note (POD)
  • Disputing the payment because the price is wrong
  • Disputing the payment because there is something wrong with the product

Looking at those topics, the actions which need to be taken differ from one request to another. LUIS should therefore help to identify the user’s intent so the service can act accordingly. Before we can set up our LUIS model, we need to group the requests into topics as follows:

  • Promising to pay by a given date (PaymentManagement)
  • Asking for a copy of the original invoices (InvoiceManagement)
  • Asking for the proof of delivery note (POD) (InvoiceManagement)
  • Disputing the payment because the price is wrong (PaymentDispute)
  • Disputing the payment because there is something wrong with the product (PaymentDispute)
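The grouping above can be sketched in code as a simple topic-to-intent mapping, which a service could later use to keep intents and topics in sync (the topic keys are shorthand for the phrases above):

```python
# Map each customer-request topic to the LUIS intent it belongs to.
TOPIC_TO_INTENT = {
    "promise to pay by a given date": "PaymentManagement",
    "copy of the original invoice": "InvoiceManagement",
    "proof of delivery note (POD)": "InvoiceManagement",
    "dispute: price is wrong": "PaymentDispute",
    "dispute: problem with the product": "PaymentDispute",
}


def intents():
    """Return the distinct intents the LUIS app will need."""
    return set(TOPIC_TO_INTENT.values())
```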

Create the Language Understanding app

Now that we have our main user intents (groups of topics), we can set up our LUIS model. The first thing we need to do is go to the LUIS portal, sign in and create a new LUIS app:

[Images 1-3: Language Understanding]

Create LUIS intents

Now that our LUIS app is created, we need to create the intents we defined earlier, which we will use later to distinguish between the various user input messages:

[Images 4-6: Language Understanding]

Create LUIS entities

Now that the intents have been added, we also need to create some entities, which are basically the variables we can later refer to in our service to extract key information. We want to capture the date for payments, the object the user wants a copy of when dealing with invoices (either a copy of the invoice or a POD), and the reason why a customer does not want to pay the invoice (either a problem with the price or with the product). So we add three entities to our LUIS model:

  • datetime
  • invoiceObject
  • paymentDisputeReason

As already mentioned, some pre-built entities are available, including date and time entities. So we can add a pre-built entity and search for “date” to add “datetimeV2”, which deals with various date and time patterns out of the box:

[Images 7-8: Language Understanding]
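To illustrate what the datetimeV2 entity buys us, here is a minimal sketch of pulling a resolved date out of a prediction result. The response shape shown follows the v2 endpoint format; treat the field names as assumptions and check them against your API version:

```python
def extract_payment_date(entities):
    """Pull the first resolved date out of a list of LUIS entities.

    Assumes the v2 endpoint response shape, where datetimeV2 entities
    carry a `resolution.values` list; adjust for other API versions.
    """
    for ent in entities:
        if ent.get("type", "").startswith("builtin.datetimeV2"):
            values = ent.get("resolution", {}).get("values", [])
            if values:
                return values[0].get("value")
    return None


# An entity as it might appear in a prediction response (illustrative):
sample = [{
    "entity": "next friday",
    "type": "builtin.datetimeV2.date",
    "resolution": {"values": [{"timex": "2018-06-15", "type": "date", "value": "2018-06-15"}]},
}]
```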

Add utterances

Now the LUIS app is ready to be filled with utterances (examples) in order to train the model on which phrase belongs to which intent. So we go ahead and add at least five sample sentences to each intent and tag our entities in each sentence as follows (remember: the more utterances you add to your intents, the more precise the results will be later):

[Images 9-14: Language Understanding]
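If you prefer scripting over the portal, labelled utterances can also be prepared programmatically. The sketch below builds one labelled example using character offsets, roughly following the shape the v2 authoring batch API expects; the field names are assumptions to verify against the current API reference:

```python
def make_example(text, intent, entity_name=None, phrase=None):
    """Build one labelled utterance for a LUIS batch upload.

    If `phrase` occurs in `text`, label it as `entity_name` using
    character offsets; offsets are inclusive in the v2 authoring
    format (verify against your API version).
    """
    example = {"text": text, "intentName": intent, "entityLabels": []}
    if entity_name and phrase and phrase in text:
        start = text.index(phrase)
        example["entityLabels"].append({
            "entityName": entity_name,
            "startCharIndex": start,
            "endCharIndex": start + len(phrase) - 1,
        })
    return example


# One unlabelled and one labelled sample utterance:
utterance = make_example("I will pay the invoice next friday", "PaymentManagement")
```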

Training the LUIS model

Having added content to our LUIS model, it is ready for training. Many people think that training is the hardest part of setting up a language understanding model, but we have basically just completed the hardest part: curating sample utterances. To train the model, all we have to do is click “Train” in the top right corner and wait a few seconds until the model is trained:

[Image 15: Language Understanding]

Testing the LUIS model

When the training is completed, we can test our LUIS app right within the LUIS portal. Just select “Test” in the top right corner and enter a sentence which is not part of your sample utterances, to see if the intent is recognised correctly. You will also see the score, which should be relatively high to be usable, as well as the extracted entities, which you can use in your service later on to narrow down the steps you need to take:

[Image 16: Language Understanding]
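In code, the score check described above typically becomes a simple threshold on the top-scoring intent. A minimal sketch, assuming the v2 response shape; the threshold value here is illustrative, not a LUIS default:

```python
CONFIDENCE_THRESHOLD = 0.5  # tune for your own model


def top_intent(luis_response, threshold=CONFIDENCE_THRESHOLD):
    """Return the top-scoring intent, or None when the score is too low.

    Assumes the v2 endpoint response shape with a `topScoringIntent`
    field carrying `intent` and `score`.
    """
    top = luis_response.get("topScoringIntent", {})
    if top.get("score", 0.0) >= threshold:
        return top.get("intent")
    return None
```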

Publishing the LUIS model

If you are happy with the test results, all you need to do now is publish your LUIS app to activate your endpoint, which you can then use in your code to call the LUIS model. Again, in the top right corner hit “Publish” and select your environment (either Production or Staging):

[Image 17: Language Understanding]

After your app is published, you will be presented with the keys and IDs you will need each time you call your LUIS endpoint from your custom application or service (i.e. your chatbot). Copy your “Authoring Key” and “Key1” from the “Keys and Endpoints” page, and your “Application ID” from the “Application Information” page, as you need all three of them within your code:

[Image 18: Language Understanding]
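With those values in hand, calling the published endpoint boils down to one GET request. A sketch of composing the prediction URL, assuming the v2 endpoint format; the region and placeholder values are illustrative:

```python
from urllib.parse import quote


def luis_endpoint_url(region, app_id, key, query):
    """Compose a LUIS v2 prediction URL.

    `region`, `app_id` and `key` come from the portal pages mentioned
    above; the URL shape follows the v2 endpoint and may differ for
    newer API versions.
    """
    return (
        f"https://{region}.api.cognitive.microsoft.com/luis/v2.0/apps/{app_id}"
        f"?subscription-key={key}&q={quote(query)}"
    )


url = luis_endpoint_url("westeurope", "<your-application-id>", "<key1>",
                        "I will pay next friday")
```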

Add LUIS to your application’s logic

Now you need to add LUIS to your application’s code logic, for example a Microsoft Bot Framework bot. Once that is done, you can ask your bot questions and it will detect your intent and respond accordingly:

[Images 19-21: Language Understanding]
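Wiring LUIS into a bot usually ends in a dispatch step: map the recognised intent to a reply, and fall back when confidence is low. A minimal sketch, assuming the v2 response shape; the reply texts and threshold are illustrative:

```python
REPLIES = {
    "PaymentManagement": "Thanks, we have noted your promised payment date.",
    "InvoiceManagement": "We will send you the requested document shortly.",
    "PaymentDispute": "Sorry to hear that, a colleague will review your dispute.",
}

HANDOFF = "Let me connect you to one of our agents."


def handle_message(luis_response, threshold=0.5):
    """Route a LUIS prediction to a canned reply.

    Low-confidence or unknown intents fall back to a human handoff,
    as described in the customer care scenario above.
    """
    top = luis_response.get("topScoringIntent", {})
    if top.get("score", 0.0) < threshold:
        return HANDOFF
    return REPLIES.get(top.get("intent"), HANDOFF)
```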

Thinking one step further: if your bot or service cannot handle a request on its own, it can still pass the information on to a human agent for further processing. This adds significant value in a customer care scenario, as human agents save time on routine tasks and answering the same questions repeatedly, leaving more time for complex requests which cannot be automated by a chatbot.

By Stephan Bisser (Microsoft MVP), Editorial Contributor


About the author

Stephan Bisser is a Technical Lead with a passion for Artificial Intelligence (AI) and the Microsoft Bot ecosystem. After continuous outstanding contributions in this area, he was awarded Microsoft MVP for Artificial Intelligence in 2018. He is the author of the Conversational AI blog, co-founder of the BotBuilder Community, a community initiative supported by the Microsoft Bot Framework team to enhance the Bot Framework SDKs and tools with even more functionality, co-founder of SelectedTech, an initiative set up to spread the word on interesting tech topics around Microsoft 365 and AI, and co-author of the book Microsoft AI.

Interested in finding out more, and how we can help in your organisation? Let’s talk!

