Best Practices for Building and Training Intents


Before You Begin

This 75-minute tutorial demonstrates some good practices for training and improving your intents.

Background

The purpose of this lab is to familiarize you with tips and techniques for training and testing your skills. Specifically, you will add utterances to a skill to train it to understand several use cases. Then you'll work with a colleague to iteratively test your skill, adding more test data as you go along to fine-tune intent resolution.

What do you need?

  • Access to an Oracle Digital Assistant instance.
  • Download Pizzeria_Starter.zip, which has the basic intents for testing.
  • Optionally (but highly recommended!), a test partner.

Get the Pizzeria_Starter Skill

We'll start with a skill that has been pre-populated with a few intents. You will first import the skill and then clone it.

If you are doing this tutorial with a partner, just one of you needs to import the skill. (In fact, once one of you has imported it, it's not possible for the other to do the same.)

Import the Starter Skill

  1. If you haven't done so already, download Pizzeria_Starter.zip.
  2. Log into Oracle Digital Assistant.
  3. Click the menu icon in the top left corner to open the side menu.
  4. Expand Development and then click Skills.
  5. Click Import Skill (located at the upper right).
    An image of Skills dialog box with the Import Skill button highlighted.
  6. Browse to, and then select, Pizzeria_Starter.zip. Then click Open.

Clone the Starter Skill

To make sure that your work doesn't collide with others doing this tutorial, you will work with a clone of the skill.

To create a clone of the skill:

  1. On the Skills page, within the tile for the Pizzeria_Starter skill, click the Options menu icon and select Clone.
  2. In the Display Name field, enter <YourInitials>_Pizzeria_Starter.

    For example: JS_Pizzeria_Starter.

  3. Select the Open cloned skill afterwards checkbox.
  4. Click Clone.

    The clone of the skill should open on the Intents page.


Enable Insights

We'll later use the Insights Retrainer to retrain the skill to properly handle phrases that it misclassifies.

The Insights feature should already be enabled in your skill. To make sure:

  1. In the left navigation of your skill, click the Settings icon.
  2. Select the General tab.
  3. Make sure that the Enable Insights switch is in the On position.

Create Utterances and Do a Round of Batch Testing

The FileComplaint, OpenFranchise, and TrackOrder intents don't have any example utterances yet, so we'll need to create those. Along the way, we'll create some extra utterances and use those as testing data to help measure the success of the skill in understanding a conversation with a user.

Here's a rundown of the purpose of each intent:

  • FileComplaint – used when the customer has an issue that needs to be resolved, probably by a live agent.
  • OpenFranchise – allows the customer to inquire about opening up a franchise to sell pizzas online.
  • TrackOrder – used when the customer wants to check the status or progress of an order, or cancel an order.

Read Up on How to Write Good Utterances

To get off to the best possible start with your training model, you may want to spend a few minutes reading up on how to create a robust set of initial utterances. See this TechExchange article on best practices for writing utterances.

Create Utterances

  • In Notepad or a similar text editor, write phrases that you think would be representative utterances for the FileComplaint, OpenFranchise, and TrackOrder intents.

    Write 10 utterances each for those intents.
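
    For example (purely hypothetical, and just to illustrate the kind of variety to aim for), utterances for the TrackOrder intent might include:

      Where is my pizza?
      Has my order been delivered yet?
      I want to cancel the order I just placed.
      How long until my food arrives?

    Your own phrasing will differ; the goal is to cover the different ways a real customer might express each intent.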

Add Utterances to the Intents

For each intent, you'll add seven of the utterances you have just written.

  1. Select the Intents icon.
  2. Select the FileComplaint intent.
  3. Next to the Examples section, click Advanced input mode.
    Screenshot showing the Examples section of the page. It includes the title ('Examples'), the label 'Utterances to Add' and the clickable text 'Advanced input mode'
  4. In the Enter your example utterances here field, paste seven of the utterances that you have written for FileComplaint and then click Create.
  5. Repeat steps 2 through 4 for OpenFranchise and TrackOrder.
  6. Click Train, leave Trainer Tm selected, and click Submit.

Create Batch Tests

Now, using the utterances that you created but didn't add to the skill, you'll create batch tests for each of the intents. As a starting point, you'll export the intents of the skill to get a .csv file with the correct format.

  1. Select More > Export intents.
    screenshot showing the selection of the Export intents menu item from the More menu.
  2. Save the file as batchtest1.csv.
  3. Open batchtest1.csv.

    Note that the first line of the file delineates the fields that you need for each entry:

    query, topIntent, conversationName, answer, enabled

    The query field refers to the entry's utterance. The topIntent field refers to the intent that the utterance should match with.

    Below the first line are entries for each of the utterances that are in the skill.

    To create the batch test file, you will replace the entries for the skill's intent utterances with a set of test utterances.

  4. Delete all of the entries that have OrderPizza as the topIntent.
  5. For each of the other top intents, retain three entries and delete the rest.
  6. In the remaining entries, replace the utterances with the utterances that you wrote earlier but did not add to the skill, making sure that each utterance correlates with the appropriate topIntent. (A hypothetical sample of the finished file appears after these steps.)
  7. Save the file.
  8. Back in Digital Assistant, click the Test icon.
  9. Select Intent.
  10. Slide the Batch switch to ON.
  11. Click Load.
  12. Drag batchtest1.csv into the Load Batch dialog and click Test.
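
For reference, here is the hypothetical sample of a finished batchtest1.csv mentioned in step 6. The utterances below are placeholders only; your file will contain the test phrases you wrote (three per intent), and each row keeps whatever conversationName, answer, and enabled values were in the exported entry you edited (shown here as "..."):

    query,topIntent,conversationName,answer,enabled
    My pizza showed up cold and an hour late,FileComplaint,...
    I was charged twice for a single order,FileComplaint,...
    What does it cost to open one of your franchises,OpenFranchise,...
    I'd like to run my own pizzeria location,OpenFranchise,...
    Where is the order I placed half an hour ago,TrackOrder,...
    Can I still cancel the pizza I just ordered,TrackOrder,...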

Take a look at the results and take note of anything that you find surprising.

For each utterance in the test, you can click its Expand icon to see how well it matches with the various intents.

Screenshot of the Try Out Intents/Q&A dialog. The Intent tab is selected. There is text showing the number of tests that passed and failed. Below that, each test utterance is listed, along with an Expand icon.
Screenshot of the Try Out Intents/Q&A dialog. The Intent tab is selected. There is text showing the query and the expected intent. Below that, intents are listed along with the confidence score for each.

Notes on What You Just Did

When developing a new skill, it is likely that you don't have any existing training utterances, so you have to synthesize utterances to train the model. Here you have followed a good practice by using some of those utterances to train the model and reserving others for batch testing.

By reserving some of the phrases for batch testing, you are always able to test your intents and check whether these phrases are resolving to the intents you expect. Ultimately, you want to show that your model is resolving more and more phrases correctly over time. This is incredibly important as you go through your skill development, where you will add training utterances and/or change the model as more people use it.


Iteratively Test Your Intents

At this point, you have done one round of testing. To make your training corpus more robust, you'll want to do several more rounds and make any necessary adjustments to your utterances as you go along. This iterative approach is a good practice for your skill development.

As part of this, you'll want to get other people involved in training your skill, since phrases you think of to match an intent will probably vary from what other people come up with.

In addition, you'll use the full tester (not the "Try It Now" tester), which you'll need in order for the conversations to be recorded in Insights.

Have a Test Partner Test Your Skill

  1. Find a test partner.
  2. Have your partner open your skill (while you open your partner's skill).
  3. Have your partner enter phrases of their own in the tester for your skill (3 phrases for each of the 3 intents).

    Do the same for your partner's skill.

    To use the tester:

    1. In the top navigation of the skill, click the tester icon.
    2. Type a phrase in the Message field and press Enter.
  4. Once you have finished entering phrases, close the tester.

Use the Insights Retrainer to Improve Your Model

The Insights feature enables you to see how your users are interacting with your skill. The Retrainer sub-feature of Insights enables you to add real user utterances to your training model.

Now let's evaluate your training partner's utterances and use the Retrainer to update your training model to correctly classify the utterances that resolved incorrectly.

  1. Re-open your copy of the skill.
  2. Open the skill's Insights by clicking the Insights icon.
  3. Select the Retrainer tab.

    You should see a search results page that shows a few of your lab partner's utterances that have resolved to a selected intent.

    Screenshot of part of the Retrainer tab of the Insights page. At the top, there are three dropdown lists showing Intent, Matches, and FileComplaint, respectively. Below that are buttons for Add Criteria and Search. Below that is an Intent Classification table with columns for Utterance, Result, Win Margin, Intents Score, and Add To. In the Add To column, is a dropdown containing intent names.
  4. Check to see if those utterances should be resolving to that intent.

    Note: If the selected intent is unresolvedIntent, that means that the matching utterances did not resolve to any of the three intents we've been working with.

  5. If you see an utterance that didn't resolve correctly, click the Select Intent dropdown for that entry, select the intent that it should have resolved to, and click Add Example.
    Screenshot showing the Select Intent dropdown list in one of the cells of the Intent Classification table.
  6. In the Search Criteria section of the page, select another intent and click Search.
    Screenshot showing the Search Criteria section of the page. At the top, there are three dropdown lists. The first two show Intent, Matches, respectively. The third is open and shows values for FileComplaint, OpenFranchise, and TrackOrder. Below that are buttons for Add Criteria and Search.

    You should now see the entries for the second intent.

  7. As before, evaluate whether each of those utterances has resolved correctly, and, for any that haven't, add them as examples to the appropriate intent(s).
  8. Repeat these steps for the remaining intents.
  9. Click the Train button.
  10. Leave Trainer Tm selected and click Submit.

Add More Utterances to the Batch Test

Based on what you learned from the phrases your testing partner added, you should also augment your test batch.

  1. Create a copy of batchtest1.csv and save it as batchtest2.csv.
  2. In batchtest2.csv, create one new entry for each intent by copying an existing entry for that intent.
  3. For each copied entry, replace the utterance with a new utterance.

    You can construct these new utterances by varying the wording of utterances that were entered by your test partner.

  4. Click the Intents icon.
  5. Click the Try it out icon.
  6. Select Intent.
  7. Slide the Batch switch to ON.
  8. Click Load.
  9. Drag batchtest2.csv into the Load Batch dialog and click Test.
  10. Evaluate the results of the new test.
    • Has the inclusion of new phrases helped in the intent resolution?
    • Are you getting better results?
    • Are you seeing obvious misclassifications?

      We'll look at fine tuning these later in the lab.

  11. With your partner, or perhaps a new partner, repeat the exercises in this section with three more utterances for each intent.

Train the Skill to Handle Spam

Now let's spend some time on the question of spam or other misuse of the skill. Up to 40% of a skill's workload may have nothing to do with the skill's intended use, and the skill needs to be able to gracefully handle this. Furthermore, training your skill to understand phrases that are outside of the use case has the benefit of helping it to disambiguate the intents it is supposed to handle.

  1. Test the skill with 10 random phrases such as "and she is buying a stairway to heaven", "tell me a joke", and "are you a lady bot" by doing the following:
    1. In the top navigation of the skill, click the tester icon.
    2. Type a phrase in the Message field and press Enter.

      Repeat this step until you have entered 10 different random phrases.

  2. Open the skill's Insights by clicking the Insights icon.
  3. Select the Conversations tab.

    You should see entries for each of the phrases that you just entered into the tester.

  4. See if any of the phrases have resolved to any of your intents.

    For those that don't resolve to any of your intents, the value of the Intent field in the entry is unresolvedIntent.

    Spoiler alert: some of the phrases probably have resolved to your intents. You'll need to train the skill to recognize phrases that are outside of the scope of the skill and deal with them appropriately.

The unresolvedIntent Intent

To handle spam and other interactions for which the skill wasn't specifically designed, create a new intent called unresolvedIntent.

As you probably noticed before, the skill implicitly matches input that it doesn't understand as unresolvedIntent, even though there isn't an unresolvedIntent that has been specifically defined. By explicitly defining unresolvedIntent, you can add training utterances to it to reduce the chance that input will inappropriately resolve to one of your other intents.

  1. With your skill open, click the Intents icon.
  2. Click + Intent.
  3. In the Name field type unresolvedIntent.
  4. In the Examples area for the intent, enter 7 of the random utterances that you just evaluated in the log.
  5. Click Train to retrain the skill.
  6. In the most recent version of your batch test file (batchtest2.csv), create three new entries by copying and pasting a previous entry.
  7. For each new entry:
    • Replace the query value with one of the three leftover unresolvedIntent utterances.
    • Replace the topIntent value with unresolvedIntent. (A hypothetical sample of these entries appears after these steps.)
  8. Save the file.
  9. In your skill in Digital Assistant, click the Try it out icon.
  10. Select Intent.
  11. Slide the Batch switch to ON.
  12. Click Load.
  13. Drag your batchtest file into the Load Batch dialog and click Test.
  14. Evaluate your test results.
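
For reference, the three new rows from steps 6 and 7 might look something like this (shown here with three of the sample random phrases from earlier; in your file they would be your three leftover phrases, and each row keeps whatever conversationName, answer, and enabled values were in the entry you copied, shown here as "..."):

    tell me a joke,unresolvedIntent,...
    are you a lady bot,unresolvedIntent,...
    and she is buying a stairway to heaven,unresolvedIntent,...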

Test Again

Now, for good measure, do additional testing with your test partner:

  1. Have your partner enter 10 random phrases into the full tester (Tester icon).
  2. Open the conversation logs on the Insights page to see what intents the phrases resolved to.
  3. Open your skill, and select the unresolvedIntent intent.
  4. Repeat steps 4-14 above to augment the unresolvedIntent with 7 of the new phrases and add 3 to the batch test file.

Notes on What You Just Did

As well as training your model to understand your intents, it is also a good practice to train the model to understand what is not an intent. You did this by defining the unresolvedIntent intent and associating phrases with it that are unrelated to the intents that support your use case.

By doing this, you are helping your skill recognize when phrases are unrelated to your use cases and thus keep these phrases from being incorrectly matched to the other intents.


Quality Reports

In parallel with applying the above techniques, you'll probably want to run and evaluate quality reports.

To run a quality report:

  1. With your skill open, click the Quality icon.
  2. Click Run Report.

When you run a quality report, it performs a random 80:20 split of the utterances, using the 80% subset to train with and the 20% subset to test with. Since the split is random, the test results may differ every time you run the report.
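
To make that concrete, here is a minimal sketch in Python of a random 80:20 split. It is purely illustrative of the general idea and is not Oracle Digital Assistant's actual implementation; the names and sample data are invented for the example:

    import random

    # Illustrative only: a conceptual random 80:20 train/test split, NOT how
    # Oracle Digital Assistant implements quality reports internally.
    utterances = [f"sample utterance {i}" for i in range(1, 21)]  # e.g., 20 utterances

    random.shuffle(utterances)                 # the split is random on every run
    cut = int(len(utterances) * 0.8)           # 80% of 20 = 16
    train, test = utterances[:cut], utterances[cut:]

    print(len(train), "training utterances,", len(test), "test utterances")
    # Because shuffle() reorders the list differently each run, a different 20%
    # is held out each time, which is why successive reports can produce
    # different results.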

After the report is completed, a set of intent pairs appears. You can then select one of these pairs to see any utterances that were misclassified during the test or which are similar to utterances in other intents. Based on this data, you may wish to further adjust the way you structure your intents.

Notes

  • The History feature only works against input entered through the skill's Conversation tester (the Tester icon), not the intent or batch tester. This means that you might have to manually enter some of your phrases in the skill Conversation tester.
  • Since quality reports do an 80:20 split between training and test data, some of the data you would normally use to train an intent will not be used as training material in the context of the report, which could result in misclassifications. For such misclassifications, you need to determine whether they are simply a result of the way quality reports work or whether they are real and you need to add more utterances.
  • You may find some misclassifications but decide that they are acceptable if they resolve with only a low confidence percentage.

Other Tips for Optimizing Intent Resolution

For future reference, here are some other things you can do to improve the quality of intent resolution:

  • Among your utterances, include key phrases that are specific to one intent, and add them as short phrases (not as full sentences).

    For example, if "no claims protection" is relevant for only one intent in a skill, add utterances such as "no claims protection", "protected no claims", and "no claims bonus" for that intent.

  • Repeat key utterances with some slight variations.
  • Check where you think utterances could apply to different intents. If there is significant overlap, consider combining those intents.
  • After applying any changes, be sure to rerun your tests and evaluate their impact.

Note: Ultimately, the best data for training your skill will come from real user utterances. Furthermore, in most cases, Trainer Tm is better suited for resolving real-world phrases. However, it does require more sample utterances to give those better results.


Want to Learn More?