Introduction

This 45-minute tutorial shows you how to integrate generative AI into your skill by connecting it to a large language model, or LLM. An LLM is an AI system that has been trained on vast amounts of text data. These models can be trained on various domains to perform a variety of tasks, including audience-specific text summarization and sentiment analysis of chat histories.

By integrating your skill with an LLM, you enable it to not only field a range of user input, but also to respond with context-appropriate answers in a human-like tone. To help the LLM predict the most likely words or phrases for its responses, you send it the appropriate context and instructions in a block of text known as a prompt. In response, the LLM generates a completion, a sequence of words or phrases that it believes are the most probable continuation of the prompt.

Scenario

For this tutorial, the LLM that you're going to integrate a skill with is the Cohere Command model. The skill will use this model to generate an email to a sales team about their progress on closing a deal with a company called Elemental Design.

Objectives

  • Create an LLM service that provides access to the Cohere Command model for your Oracle Digital Assistant instance.
  • Create an event handler that transforms the REST payloads to and from the Cohere Command format to the format used by Oracle Digital Assistant.
  • Add a state to the dialog flow that connects users to the LLM service provider.
  • Add the prompt that describes the email to the Cohere Command model.
  • Test the prompt.

What Do You Need?

  • Access to Oracle Digital Assistant Version 23.10 or higher
  • An LLM service for Cohere. This includes:
    • A POST endpoint to Cohere's Command model

      Note:

      While Cohere has introduced both a new API (/chat) and model (cohere.command.R), this tutorial is based on the /generate API and the associated Cohere.command model. The API for the /generate endpoint is still valid, but if you wish to use the /chat endpoint instead, then you will need to manually update the request and response payloads.
    • A Cohere-provided bearer token
    • Sample provider-specific request and static response payloads
  • The following artifacts, which are included with this tutorial:

Task 1: Create the LLM Service for the Cohere Command Model

Your first task in integrating your skill with an LLM is to add a service to your instance that calls the /generate endpoint. If a service has already been configured for your instance, or if you're taking this tutorial in a lab setting where this service has already been provided, take note of the service name and move on to the next task.

  1. With the Oracle Digital Assistant UI open in your browser, click main menu icon to open the side menu.
  2. Expand Settings > API Services.
  3. The API Services menu item
  4. Open the LLM Services tab.
  5. The LLM Services tab
  6. Click +Add LLM Service.
  7. Complete the LLM Service dialog to create a POST operation to the provider's endpoint.
    • Name: Enter an easily identifiable name. You'll reference this name later on.
    • Endpoint: Copy and paste the Co.Generate endpoint to enable access to the Command model. For example:
      https://api.cohere.ai/v1/generate
    • Methods: Select POST.
  8. Click Create.
  9. The Create LLM Service dialog
    Description of the illustration
  10. Complete the service by adding the API key, and the request and response payload samples:
    • Authentication Type: Select Bearer Token. Then copy and paste the Cohere-provided token value.
    • POST request: Select application/json as the Content Type.
    • Body: Add the Command-specific sample request body:
      {
          "model": "command",
          "prompt": "Generate a fact about our milky way",
          "max_tokens": 300,
          "temperature": 0.9,
          "k": 0,
          "stop_sequences": [],
          "return_likelihoods": "NONE"
      }
  11. For Static Response, choose 200-OK. Then enter the following sample payload:
    {
        "meta": {
            "api_version": {
                "version": "1"
            }
        },
        "generations": [
            {
                "id": "6c92c66b-36d2-41f7-a08e-363fa90a40f6",
                "text": " The largest black hole in the Milky Way is called Sagittarius A*, which is located at the center of our galaxy. It has a mass of approximately 4.3 million times that of the Sun."
            }
        ],
        "id": "72f07ada-a5b4-4e69-bd5b-5f1e98cd5784",
        "prompt": "Generate a fact about our milky way"
    }
  12. Click Test Request to check for a 200 response.
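If you'd like to sanity-check the endpoint and bearer token outside of Digital Assistant, a minimal Node.js call to the /generate endpoint might look like the following sketch. The COHERE_API_KEY environment variable is an assumption for illustration, not part of the tutorial:

```javascript
// Sketch: call the Cohere /generate endpoint directly, e.g. to verify
// your bearer token before configuring the LLM service.
// COHERE_API_KEY is an assumed environment variable (illustration only).
const requestBody = {
  model: "command",
  prompt: "Generate a fact about our milky way",
  max_tokens: 300,
  temperature: 0.9,
  k: 0,
  stop_sequences: [],
  return_likelihoods: "NONE"
};

async function generateFact() {
  const res = await fetch("https://api.cohere.ai/v1/generate", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.COHERE_API_KEY}`
    },
    body: JSON.stringify(requestBody)
  });
  if (!res.ok) throw new Error(`Cohere returned HTTP ${res.status}`);
  const data = await res.json();
  // The completion text is in generations[0].text (see the static response above).
  return data.generations[0].text;
}
```

Note that the request body mirrors the sample you pasted into the Body field, so a 200 response here should predict a successful Test Request in the dialog.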

Task 2: Import the Mock REST Service

In this step, we will add the REST service to the instance that provides mock data for the opportunity.

  1. If you're not still in the API Services page from the last step, reopen it by clicking Settings > API Services.
  2. Open the Rest Services tab.
  3. The REST services tab.
  4. Click More. Then select Import REST Services.
  5. Browse to, then select the RESTService-Tutorial_OpportunityDetails.yaml file from RESTService-Tutorial_OpportunityDetails.zip file that you extracted to your computer.
  6. Click Open.
  7. Confirm that the REST service has been imported by locating it in the list in the left menu.
  8. Description of the image follows
    Description of the illustration

Task 3: Import the Starter Skill

With the REST and LLM services added to the instance, you now need to import a skill and connect users to it through its dialog flow definition.

To import this skill:

  1. With the Oracle Digital Assistant UI open in your browser, click main menu icon to open the side menu.
  2. Click Development and then select Skills.
  3. A description of this image follows.
  4. Click main menu icon again to collapse the side menu.
  5. Click Import Skill (located at the upper right).
  6. Navigate to, and then select the Generate_Email_Skill_XXX(1.0).zip that you've downloaded to your computer. Then click Open.
  7. Open the skill and take a look at its intents, flows, and entities.
    • Intents -- The GenerateEmail intent.
    • Entities -- The EmailDetails composite bag that contains the Topic item (a STRING) and Opportunity, a value list entity of company names. This entity is associated with the GenerateEmail intent. There is also the CustomerRequirement entity, a value list entity whose values include competitive pricing, high performance, and high quality product.
    • Flows -- The starter flow, GenerateEmail, which is mapped to the GenerateEmail intent. This flow has two states: GetEmailDetail, which resolves the EmailDetail composite bag entity, and the GetCompanyDetails state, which returns the payloads from the Tutorial_OpportunityDetails REST service. The flow also has the following flow-scoped variables:
      Variable Name Value
      EmailDetail References the EmailDetails composite bag entity to capture user input.
      OpportunityDetail A map that stores the payload of the Tutorial_OpportunityDetails REST service.
      EmailSignature A static string

      Note:

      Check the status of the Train button to make sure that training has started.

Task 4: Connect the Skill to the Cohere Model

We're now going to enable the skill to access the Cohere service by creating a custom component with an event handler that transforms the REST payloads into formats that are accepted by the Command model and Oracle Digital Assistant.

Complete the following steps:

  1. Click Components The Components icon in the left navbar.
  2. The Components icon in the left navbar
  3. Click Add Service.
  4. In the Create Service dialog:
    • Enter Cohere in the Name field.
    • Accept the default setting, Embedded Container.
    • Select New Component.
    • Select LLM Transformation from the Component Type drop down list.
    • The Component Type dropdown menu in the Create Service dialog
    • Enter Cohere_Command in the Component Name field.
    • Select Cohere (located under other) from the Template drop down list.
    • The Template drop down menu in the Create Service dialog.
  5. Click Create.
  6. The Create Service Dialog
    Description of the illustration

    The Edit Component page opens after the service has been deployed, displaying the template generated for the Cohere Command model.
    The Edit Component Code page
    Description of the illustration

    Its transformation handlers call the following methods, which map the model-specific payload format to the interface used by Oracle Digital Assistant, known as the Common LLM Interface (CLMI):
    • transformRequestPayload
    • transformResponsePayload
    • transformErrorResponsePayload
    Because you selected the Cohere template, these handlers already include the Cohere.command-specific transformation code. No editing is required. If your skill calls a non-Cohere model (or the cohere.command.R model), then you'll need to manually update the handlers.

    Note:

    The handlers in this template support the /generate API and the associated Cohere.command model, not the /chat API that's used for the cohere.command.R model. If you wish to use the /chat endpoint for this tutorial, then you will need to manually update the request and response payloads in the generated template.
  7. Click Close (located at the upper right) to return to the Components page.
  8. The deployment status indicator message.
  9. Ensure that Service Enabled (the default setting) is switched on.
  10. The Create Service Dialog
    Description of the illustration
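The generated template is the authoritative code, so no editing is needed here. Purely as an illustration of the kind of mapping these handlers perform, a stripped-down sketch might look like the following. The CLMI field names (messages, maxTokens, candidates) are an approximation of the interface, not a verbatim copy of the template:

```javascript
// Illustrative sketch (not the verbatim generated template) of how the
// three handlers map between CLMI and the Cohere /generate format.
// Field names on the CLMI side are approximations; consult the generated
// Cohere template for the authoritative code.
const handlers = {
  // CLMI request -> Cohere /generate request: take the latest user
  // message as the prompt and carry over the generation settings.
  transformRequestPayload: async (event, context) => ({
    model: "command",
    prompt: event.payload.messages[event.payload.messages.length - 1].content,
    max_tokens: event.payload.maxTokens,
    temperature: event.payload.temperature
  }),

  // Cohere /generate response -> CLMI response: each generation's text
  // becomes a candidate's content.
  transformResponsePayload: async (event, context) => ({
    candidates: event.payload.generations.map((g) => ({ content: g.text }))
  }),

  // Cohere error body -> CLMI error shape.
  transformErrorResponsePayload: async (event, context) => ({
    errorCode: "unknown",
    errorMessage: event.payload.message
  })
};

module.exports = handlers;
```

The same three-handler structure is what you would edit if you later switched to a non-Cohere model or to the /chat endpoint.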

Task 5: Define the Cohere LLM Service for the Skill

To enable the skill to connect users to the Cohere endpoint through the dialog flow, you need to create an LLM Service. This is a skill-level service that combines the instance-wide Cohere LLM service with the Cohere_Command transformation event handler that you created in the previous step.

  1. Click Settings The Settings icon in the left navbar.
  2. The Settings icon in the left navbar
  3. Open the Configuration page.
  4. The Configuration tab in Settings
  5. In the Large Language Models Services section (located near the bottom of the page), click +New LLM Service.
  6. The Large Language Model Section of the Configuration page.
    Description of the illustration
  7. Complete the following fields:
    • Name: Enter CohereService. You'll reference this name when you build the dialog flow in the next step.
    • LLM Provider: Select the name of the Cohere LLM service that you created in Task 1: Create the LLM Service for the Cohere Command Model.
    • Transformation Handler: Select Cohere_Command.
    • Leave the remaining properties in their default settings. Note that Default is switched on (true) if this is the only service that you've created so far for this tutorial.

      Important:

      Be sure that Mock is switched off.
    • Click the Save icon (located at the right in the Action column).
      The Save service icon
    • The Large Language Models section of the Settings page with values.
      Description of the illustration

Task 6: Integrate the Command Model with the Skill

Now that the skill is connected to the service for the Command model, you're going to connect your skill's users to the model by creating a dialog flow component that can call the model and tell it what to do. The component conveys these instructions using a prompt, which is a block of human-readable text.

  1. Click Flows The Flows icon in the left navbar.
  2. The Flows icon in the left navbar
  3. Select the GenerateEmail flow.
  4. The GenerateEmail Flow

    If there's an error The error icon. on the GetCompanyDetails state, open the state's Component tab and make sure that the following properties have been configured:

    • Rest Service: Tutorial_OpportunityDetails
    • Authentication Type: No Authentication Required
    • Endpoint: https://www.opportunitysummary.com
    • Method: GET
    • Parameters:
      Key Value Type
      OPPORTUNITY ${EmailDetail.value.Opportunity.value} Query
    • Response Mode: Always Use Static Response
    • Result Variable (Flow Scope): OpportunityDetail

  5. In the GetCompanyDetails state, click The menu icon and then select Add State from the menu.
  6. The Add State option
  7. Select Service Integration.
  8. The Add State dialog with Service Integration selected
    Description of the illustration
  9. Select Invoke Large Language Model.
  10. Enter GenerateEmail in the Name field.
  11. In the Description field, enter Email generator. Then click Insert.

    Note:

    As a best practice, always add descriptions to the invokeLLM states. These component descriptions enable multi-turn conversations when users access an LLM skill through a digital assistant.
  12. The Add State dialog with Invoke Large Language Model selected
    Description of the illustration
    The dialog flow now includes the GenerateEmail and the showLLMError states.
    The invokeLLM state with the showError state in the dialog flow
    Description of the illustration
  13. Click the GenerateEmail state.
  14. In the Component tab, select CohereService for the LLM Service field.
    The LLM Service field
    Description of the illustration

    Note:

    If CohereService is the only LLM Service that you've created, then you can select Default.

Add the Prompt and Prompt Parameters

In this step, you're going to add the prompt that describes the type of email expected from the model.

  1. Click Build Prompt to open the Prompt Builder, a tool that enables you to iterate, or engineer, your prompt.
  2. The Build Prompt button
  3. Paste the prompt text into the Prompt field. Then click Save Settings (located at the bottom right).
  4. The Prompt Builder
    Description of the illustration
  5. The prompt text references the variable values that are passed in for the OPPORTUNITY, TOPIC, OPPORTUNITY_DETAILS, and EMAIL_SIGNATURE parameters. For example:
    Draft an email to the ${OPPORTUNITY} sales team
    
    For the LLM to incorporate these parameters, they need values. Because these values are missing, the editor flags errors. You will add these values in the next step.
  6. Errors for undefined parameter values.
    Description of the illustration
    To provide the LLM with the parameter values it needs to generate the email, you need to provide a FreeMarker expression for each parameter. Because the parameters provide the LLM with values from various sources (composite bag entity items and the OpportunityDetails REST service), the FreeMarker syntax varies. To add these parameters:
    • Click Add The Add icon next to Prompt Parameters.
    • The Prompt Parameters label
    • Enter the parameter name in the Name field.
    • Enter the FreeMarker expression in the Value field.
    • Click Save The Apply icon.
      The Prompt Parameters section of the Component page
      Description of the illustration
      Add the following parameters and expressions.
      Parameter FreeMarker Expression Variable Value Source
      OPPORTUNITY
      ${EmailDetail.value.Opportunity.value}
      Composite Bag Entity (list value item)
      TOPIC
      ${EmailDetail.value.Topic}
      Composite Bag Entity (STRING item)
      OPPORTUNITY_DETAILS
      ${OpportunityDetail}
      Flow-scoped variable; value provided by REST service
      EMAIL_SIGNATURE
      ${EmailSignature}
      Flow-scoped variable (static value)
      The error messages will disappear as you define the parameters. When you're finished, all of the error messages should be gone.
  7. Click The expand icon to expand the User Messaging section of the Component tab.
  8. The User Messaging section of the Component page
    Description of the illustration
  9. Set Use Streaming to False so that the message is delivered in its entirety, not incrementally. We recommend that you disable streaming for Cohere models.
  10. The Enable Streaming option
  11. For the Standard Actions, remove all of the actions except for Undo.
  12. The Undo action
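The parameter substitution above is performed by FreeMarker inside Digital Assistant at runtime. As a standalone illustration only (this sketch mimics the ${...} interpolation; it is not FreeMarker itself), here is how the values flow into the prompt text:

```javascript
// Standalone sketch that mimics the ${PARAM} interpolation FreeMarker
// performs inside Digital Assistant (illustration only, not FreeMarker).
const promptTemplate =
  "Draft an email to the ${OPPORTUNITY} sales team for the following purpose: ${TOPIC}";

// Example values, as they might be resolved from the composite bag entity.
const params = {
  OPPORTUNITY: "Elemental Design",
  TOPIC: "Congratulations"
};

function fillPrompt(template, values) {
  // Replace each ${NAME} placeholder with its value; leave unknown ones intact,
  // which is why undefined parameters show up as errors in the editor.
  return template.replace(/\$\{(\w+)\}/g, (match, name) =>
    name in values ? values[name] : match
  );
}

console.log(fillPrompt(promptTemplate, params));
// -> Draft an email to the Elemental Design sales team for the following purpose: Congratulations
```

A placeholder with no value stays unresolved, which is the condition the editor's error messages are flagging.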

Task 7: Test the Prompt with the Prompt Builder

Before we test the prompt that you added in the previous step, let's take a quick look at it.

This prompt reflects good prompt design because:

  • It assigns a persona to the LLM that is use case-specific:
    You are a professional email writer.
  • It provides brief and concise instructions:
    Draft an email to the ${OPPORTUNITY} sales team for the following purpose: ${TOPIC} considering the following details about the opportunity:
    
    Opportunity details:
    
    ${OPPORTUNITY_DETAILS}
    
    Email Signature:
    
    ${EMAIL_SIGNATURE}
  • It defines clear acceptance criteria:
    - Your email should be concise, and friendly yet remains professional.
    - Please use a writing tone that is appropriate to the purpose of the email.
    - Ensure to make it obvious when the opportunity is to close.
    - Optionally include the things that are important to the customer when making their buying decisions
    - If the purpose of the email is negative; for example to communicate a miss or loss, do the following: { Step 1: please be very brief. Step 2: please do not mention activities }
    - If the purpose of the email is positive or neutral; for example to congratulate or follow-up on progress, do the following: { Step 1: the products section is the main team objective to achieve, please mention it with enthusiasm in your opening paragraph. Step 2: please motivate the team to finalize the pending activities. }
    ...		 

    Writing prompts is an iterative process. In fact, continually refining your prompt is a best practice. It may take several revisions before a prompt returns the results that you expect. To help you through this revision cycle, you can use the Prompt Builder to incrementally test and modify your prompt until it functions properly.

    The Prompt Builder
    Description of the illustration

To test the prompt, you need to add mock values for the referenced parameters. The tone and content of the model's output are based on these values. You can have the model generate random values by clicking Generate Mock Values, but to control the output, you need to add your own. To add these values:

  1. In the Component tab, scroll back to the top and click Build Prompt to open the Prompt Builder.
  2. The Build Prompt button
  3. Click Edit the Edit icon.
  4. Enter a value for the parameter in the Mock Value field. When you're done, click Apply The Apply icon.
    The Prompt Parameters dialog
    Description of the illustration
    Here are the example values:
    Parameter Mock Value
    EMAIL_SIGNATURE John Smith
    OPPORTUNITY_DETAILS Completed
    OPPORTUNITY Elemental Design
    TOPIC Congratulations
  5. After you've completed the mock values, click Generate Output.
  6. The Generate Output button
  7. Verify that the LLM output in the LLM Output field is an email that both adheres to the prompt guidelines and incorporates the parameter values. For example, the output may be like this:
    Hi Team, I just wanted to send a quick congratulations to all of you for closing this opportunity. The Elemental Design sales team worked tirelessly to finalize this opportunity and their efforts have paid off! This opportunity was particularly noteworthy because of the following factors (if relevant): The customer was very hesitant about their needs at first, but your team showed exceptional patience and provided them with all the information they needed to make an informed decision. The customer also had a strict budget, and your team worked hard to provide a solution that fit their needs and their price range. Going forward, I think it's important to keep in mind what contributed to our success with this opportunity. Your ability to provide thorough and thoughtful solutions to customers is key to our team's success, so keep up the great work! I also want to motivate the team to finalize the pending activities for this opportunity. We still have a long way to go this quarter and I know that you all can continue to reach new heights. Thank you again for your hard work and dedication! Sincerely, John Smith Is there anything else I can do to help draft an email to the Elemental Design sales team?
  8. The Prompt Builder
    Description of the illustration
  9. Optional step -- Here are a couple of things for you to experiment with before you click Close The Close button to return to the Component tab:
    • Replace the optional directive (around Line 14):
      - Optionally include the things that are important to the customer when making their buying decisions
      with
      - Include the things that are important to the customer when making their buying decisions.
      Then click Generate Output. Your results may differ, but here's an example:
      Hi Team, I just wanted to send a quick congratulations to all of you for closing the opportunity with [Client Name]. This was a big opportunity that required a coordinated effort from everyone on the team, and I'm really proud of what we were able to achieve together. When looking at what was important to the client when making their buying decision, it's clear that our team's ability to demonstrate the value of our products was a key factor in closing this deal. Our portfolio of market-leading products provides unparalleled advantages to our clients, and I believe that this opportunity will be the first of many successes for us this year. For the upcoming opportunities, let's continue to motivate each other to finalize the pending activities. I have no doubt that with our team's skills and cooperation, we will be able to repeat this success and achieve our sales goals this quarter. Thanks again for your hard work and dedication. I'm excited to see what we can achieve next! Sincerely, John Smith Is there anything else I can do to help draft an email to the Elemental Design sales team?
    • Allow the model to generate more creative responses by changing the Temperature from 0 (straightforward responses) to 1 (more randomized responses).
      The Temperature option
      Click Generate Output. Here's an example (your results may again differ):
      Hi Team, I just wanted to send a quick congratulations to all of you for closing the opportunity with [Client Name]. This was a big opportunity that required a lot of effort and teamwork, and I am proud of everything that we have achieved. Our products were a perfect fit for [Client Name]'s needs, particularly because [explain why]. Their decision to purchase from us rather than from our competitors is a testament to the hard work and dedication that goes into everything we do here at Elemental Design. I want to give a special shout-out to [Name] for leading the charge on this one. We would not have closed this deal if it were not for their persistence in pursuing the client and making sure that we were delivering exactly what they needed. It was a team effort, however, and I want to thank everyone for their contributions. Now, let's keep up the good work and keep pursuing new opportunities to make 2022 our best year yet! Sincerely, John Smith [Your Name] [Your Title] [Company Name] [Your Contact Information] Would you like me to make any other changes to this email to better suit your needs?

      Note:

      Do not click Save Settings as this will overwrite the original prompt text.
  10. Click Close The Close button to exit the Prompt Builder.

Task 8: Test the Prompt with the Skill Tester

Now that you've verified that the LLM can receive the skill's input, you're ready to interact with it in the Skill Tester.

  1. Open the Skill Tester by clicking Preview (located at the upper right).

  2. The Preview button
  3. Enter the following request:
    Send an email to the Elemental Design sales team that congratulates them on the progress and let them know that we should talk strategy for closing in tomorrow's meeting.
    The output may look something like this.
  4. The skill Tester
    Description of the illustration
  5. In the Conversation pane, notice that the conversation remains in the GenerateEmail state.
    The Conversation view in the Skill Tester
    Description of the illustration
    That's for two reasons: there is no error that would cause the dialog flow to move on to the showLLMError state, and the component is configured for multi-turn interactions that allow you to refine the LLM output.
  6. To get a look at the outcome of the GenerateEmail state's processing, open the LLM Interaction tab.
  7. The Preview button
    This view renders when the dialog flow lands on an LLM component state like GenerateEmail. From it, you can compare the outcome of the state's processing at each turn of the conversation with the result that's sent to skill users. You can also review the prompt populated with variable values. In our case, we want to find out which values were sent for the OPPORTUNITY, TOPIC, OPPORTUNITY_DETAILS, and EMAIL_SIGNATURE variables. To access the prompt in this form, hover over the text in the Initial Prompt/Refinement column, right-click, then choose Show Full Text.
    The Conversation view in the Skill Tester
    Description of the illustration
    By scrolling along the Prompt window, note, for example, that ${OPPORTUNITY} and ${TOPIC} in the original prompt text have been replaced by Elemental Design Team and the string that you just entered ("Send an email to the Elemental Design sales team that congratulates them...").
    The Prompt view.
  8. Click Close The Close button to exit the Prompt window.
  9. To compare the model's processing to the result that's output by the skill, hover over the text in the Outcome field, right-click, then select Show Full Text.
    Show Full Text option.
    Because the model produced a valid, error-free response, the contents in the Outcome field match those in the Result field. The two may not always match, as you'll find out in Task 9: Extra Credit - Validate the LLM Output.

  10. Show Full Text option.
  11. Click Close The Close button to exit the Outcome window.
  12. Returning to the Bot Tester window, try refining the output by entering the following:
    Include the things that are important to the customer when making their buying decisions
    The LLM should incorporate this feedback into its response. For example, it might include something like the following:
    With this opportunity coming to a close on August 15th, it's important to send out those product samples by June 10th and schedule a product demo by June 15th. As a team, it is crucial to meet to negotiate pricing and terms by June 25th. These are the things that are important to the customer when making their buying decisions, so it's vital that we complete these steps in order to give them the best experience possible and close the deal
  13. The Skill Tester
  14. This new iteration of the message now includes the Undo button, which reverts the output to the previous response.
  15. Open the LLM Interaction tab. The view reflects the outcome of the processing for this second conversation turn. Even though a second turn has been executed, the conversation remains in the GenerateEmail state.
  16. The Conversation view in the Skill Tester
    Description of the illustration
  17. Click Reset, then close the Skill Tester.
  18. The Reset button

Task 9: Extra Credit - Validate the LLM Output

In this step, you're going to use the declarative validation functions of the InvokeLLM component to test the LLM output for the presence (or absence) of one of the values defined for the CustomerRequirement entity, such as competitive pricing or customizable design.
  1. Open the GenerateEmail state and select the Component tab.
  2. Expand User Messaging again. Note the Use Streaming setting, which you set to False in Task 6: Integrate the Command Model with the Skill to accommodate the Cohere model. Regardless of the model, you must always disable streaming when validating the LLM output because messages can't be validated in chunks; they can only be validated when they're complete. If you enable both streaming and validation, users may see multiple streams of output, which may confuse them.
  3. The Use Streaming field
  4. Click The expand icon to expand Response Validation.
  5. The Response Validation field
    Description of the illustration
  6. In the Validation Entities field, select CustomerRequirement.
  7. The validation entity list.
    Description of the illustration
  8. Open the Skill Tester.
  9. Enter the following request:
    Send an email to the Elemental Design sales team that congratulates them on the progress and let them know that we should talk strategy for closing in tomorrow's meeting.
    The skill will reply with an "Enhancing the response. One moment, please..." message.
    The Skill Tester
    Description of the illustration
    The message that follows it may be valid because it incorporates the CustomerRequirement entity values like competitive pricing or durability, or it may not be valid because values like these are missing. When the message is not valid, the dialog flow transitions to the showLLMError state, which outputs an error message that names the entity whose values were not matched (CustomerRequirement), the HTTP status code returned by the call to the LLM (200) and the CLMI (Common LLM Interface) error code noting that output failed validation (responseInvalid):
    An unexpected error occurred while invoking the Large Language Model: {"errorMessage":"The CustomerRequirement is not specified in the response.","errorStatus":200,"errorCode":"responseInvalid"}
    The dialog flow in the error state.
    Description of the illustration

    In the LLM Interaction tab, the Result field that was rendered for the previous (successful) outcome has been replaced by the Invocation Error panel. The contents of its Error Code, Error Status, and Error Message fields match the error information that's passed to the showLLMError state, but for this conversation turn, the LLM Interaction view provides you with the additional detail that this error was thrown after one retry attempt (the default number of attempts allotted for InvokeLLM component states like GenerateEmail).
    LLM Interaction tab
    Description of the illustration

    If you look at the contents in the Outcome field, you'll see the generated response. Because it's missing the entity values, it failed validation. For example:
    Hi Elemental Design Sales Team, Congratulations on the progress made on opportunity SO-12345 to win the deal worth $60,000 involving athletic footwear and apparel! You've done a great job, and it's time to talk strategy for closing. Tomorrow's meeting is a vital step toward achieving the goal, so keep it up! I want to express my enthusiasm for the team's efforts in engaging the customer's interest in the products and moving forward with the deal. You are on track to close the opportunity on August 15th, and I want to ensure we are ready for the next steps. I'll see you tomorrow at the meeting. Let's discuss further so we can make sure we're on the right track. Sincerely, John Smith, Regional Sales Director [Remove signature block]

    Here's an example of valid output. In this case, the dialog flow doesn't transition to the showLLMError state, but remains in the GenerateEmail state as it did in Task 8: Test the Prompt with the Skill Tester.
    Hi Elemental Design team, Congratulations on the progress you've made on opportunity SO-12345! The team has been working tirelessly and it has paid off. I'm confident that we're going to be able to close this deal soon. These are the things that are important to the customer when making their buying decisions, namely, high-quality materials for durability and performance, competitive pricing to fit within their budget, responsive customer support and after-sales service, and customizable design options for team branding. It is vital that we keep these in mind when working on this opportunity going forward. I'll see you tomorrow at our meeting where we can discuss our strategy for closing this opportunity. In the meantime, make sure you're focusing on sending out those product samples and scheduling that product demo. We need to move quickly to finalize the pending activities, notably those that are scheduled for June 10th and June 15th. There is no doubt that we will be able to achieve our objective if we remain focused.
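Conceptually, the entity validation performs a check along these lines. This is an illustration only: ODA's actual validation uses entity resolution on the output, not the simple substring match sketched here, and the value list below is taken from this tutorial's entity description:

```javascript
// Conceptual sketch of the response validation check: the LLM output
// must mention at least one CustomerRequirement value, otherwise the
// responseInvalid CLMI error is raised and the flow moves to showLLMError.
// (Illustration only: ODA resolves entities; this just matches substrings.)
const customerRequirementValues = [
  "competitive pricing",
  "high performance",
  "high quality product"
];

function validateResponse(text, values) {
  const lower = text.toLowerCase();
  // Valid if any entity value appears in the generated email.
  return values.some((v) => lower.includes(v.toLowerCase()));
}
```

Under this model, the first (invalid) sample email above fails because none of the entity values appear in it, while the second (valid) sample passes because it mentions competitive pricing.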


More Learning Resources

Explore other labs on docs.oracle.com/learn or access more free learning content on the Oracle Learning YouTube channel. Additionally, visit education.oracle.com/learning-explorer to become an Oracle Learning Explorer.

For product documentation, visit Oracle Help Center.