Define, modify or remove test scripts

A test script is a file that contains test cases and the set of outcome attributes (both global and entity attributes, including any defined tolerances) used by those test cases. Oracle Policy Modeling has an integrated regression tester which uses test scripts to compare the outcomes from a rulebase with another set of outcomes.

Test scripts use the runtime model of the rulebase, so if you make any changes to your rulebase while regression testing, you will need to close and re-open your test script for those changes to be reflected.

What do you want to do?

Create a new test script file

Create new test cases

Copy an existing test case

Create input data

Specify expected results

Create an outcome set

Modify a test script

Validate a test script

View the details of a test script

Remove a test script

Change the platform that the regression tester runs on

Create a new test script file

To add a new test script file to your project:

  1. In Oracle Policy Modeling, select the Test Scripts folder in the Project Explorer.
  2. Right-click and select Add New Test Script File from the pop-up menu.
    A new test script file will be added to your project. The new file will be selected and highlighted in the list.
  3. Type a name for your test script file, for example, "Test Scripts".
  4. Save your project by selecting File | Save All.

 

TIP: Multiple test scripts can exist in a project. A single test script on a large project can present problems if the project is under source control because, generally, only one person can edit a file at a time. Defining multiple test scripts allows each to be edited separately. Multiple test scripts may also be defined to enable different reports to be created for a given set of test cases, and/or to enable the use of different outcome sets.

Create new test cases

A test case is a combination of an input data set and expected results.

Test cases can be created, edited and deleted in Oracle Policy Modeling.
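Conceptually, a test case is just a pairing of input values with expected outcome values. As a rough illustration only (a plain Python structure with made-up attribute names, not Oracle Policy Modeling's actual test script file format):

```python
# Illustrative sketch only -- NOT Oracle Policy Modeling's file format.
# A test case pairs an input data set with a set of expected results.
test_case = {
    "name": "Example test case",
    "input_data": {
        "the person's age": 67,          # base-level attribute values
        "the person has dependants": True,
    },
    "expected_results": {
        "the person is eligible": True,  # outcome attribute values
    },
}

def passes(actual_results, expected_results):
    """A test case passes when every expected result matches the value
    the rulebase actually inferred from the input data."""
    return all(actual_results.get(attr) == value
               for attr, value in expected_results.items())
```

When the regression tester runs, the input data is loaded into the rulebase, outcome values are inferred, and each expected result is compared against its actual result in this spirit (tolerances aside).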

 

To add a new test case to your test script:

  1. In Oracle Policy Modeling, open your test script file by double-clicking it in the Project Explorer.
  2. Select the test script file in the Test Cases tab, right-click and select New Test Case from the pop-up menu. A new test case will be added to your test script. The new test case will be selected and highlighted in the list.
  3. Type a name for your test case (see Tips below), then press Enter.

 

 

Test cases can also be imported and exported to allow for external creation and editing. See Import test cases from another project and Create a test case from within an interview for more information.

Copy an existing test case

To create a copy of an existing test case in your test script:

  1. In Oracle Policy Modeling, open your test script file by double-clicking it in the Project Explorer.
  2. Select the test case you wish to copy in the Test Cases tab, right-click and select Copy from the pop-up menu. The test case will be copied to a new test case called "Copy (1) of <original test case name>".
  3. Rename the new test case as required.

Create input data

Once you have created your new test case, you need to set up the input data for your test case. The input data is the set of data from which the actual results (outcome values) of the test case are generated. The input data contains attribute instances and entity instances, along with the values that should be assigned to them.

The test case editor is used to investigate goals, infer relationships and set values for base level attributes in Oracle Policy Modeling. The test case editor can be accessed by double-clicking a test case on the Test Cases tab in the test script. (The test case editor is very similar to the debugger with a Data view and a Decision view.)

Investigate a goal

To investigate a goal in the test case editor:

  1. In the Data view select the goal you want to investigate.
  2. Right-click and select Investigate. This will open the Decision view with the attribute you have selected in the Attribute field. All of the relevant paths to the goal are shown in the text box below. Entities for which no instances have been created yet will be shown just by the relationship icon and the entity text.
  3. Work your way through the list of questions, setting answers (see below). In order to investigate any attributes which belong to an entity, you will need to add instances of that entity. (See Set up entities and containment relationships for more information.) Add your entity instances and continue investigating attributes until a value for the goal is known.

Investigate an inferred relationship

After you have added any entity instances in the test case editor, you can investigate an inferred relationship. To do this:

  1. In the Data view select the inferred relationship that you want to investigate.
  2. In the right hand pane, click the Investigate button. This will switch to the Decision view.
  3. Set the values for any base level attributes (see below). The Decision view will be updated as you go to show which entity instances have been inferred for this relationship, and the attributes contributing to this conclusion.

Set the value for an attribute

To set the value of an attribute in the test case editor:

  1. Select the attribute in the Data view or in the Decision view.
  2. Right-click and select from any of the following Set options from the menu:
    Set Value - this opens the Set Attribute Value dialog box where you can enter a value or set the value to 'uncertain' or 'unknown'. Variable values must be entered in the correct format: See Formatting of variable values. You can also specify change points for the attribute.
    Set to True - this option is only available for boolean attributes
    Set to False - this option is only available for boolean attributes
    Set to <value> - this option is only available for non-boolean text attributes. The values that appear here will be the values used in the rules or on screens.
    Set to Unknown - this option is used to clear the value of the attribute
    Set to Uncertain

 

Alternatively, you can double-click the selected attribute to open the Set Attribute Value dialog box and then select the appropriate value, ensuring that it is entered in the correct format.

After setting a value, the list of attribute values in the Data and Decision views will be updated with the value you specified, as well as the values for any other attributes which have been inferred as a result.
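The cascade described above, where setting one base-level value causes dependent attributes to be re-inferred, can be pictured loosely as follows (plain Python with a hypothetical rule; the real inference is performed by the rulebase engine, not code like this):

```python
# Loose illustration of value propagation. The rule below is made up;
# the real inference is done by the Oracle Policy Modeling rulebase engine.
def infer(values):
    """Re-derive inferred attributes from the base-level values."""
    derived = dict(values)
    # hypothetical rule: the person is a pensioner if their age >= 65
    if "the person's age" in values:
        derived["the person is a pensioner"] = values["the person's age"] >= 65
    return derived

# Setting a base attribute updates the inferred attribute alongside it,
# much as the Data and Decision views refresh after Set Value.
state = infer({"the person's age": 67})
```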

Create input data in an interview

Input data can also be created by setting values for attributes in the debugger or Web Determinations and then saving/exporting this data as an XDS file which can then be imported into a test case in Oracle Policy Modeling.

See Create a test case from within an interview for more information.

Specify expected results

Once you have created the input data for your test case, you need to specify the expected results for the test case. The expected results are the data set that is matched against the actual results when the input data is loaded into the rulebase. The expected results contain instances of the attributes and entities found in the outcome set. When attributes are added to or deleted from the outcome set, the expected results of all test cases in that test script are updated accordingly.

 

To specify the expected result for an attribute:

  1. In the Data view for the test case, select the inferred attribute that you want to add an expected result for. NOTE: The attribute must already be in the outcome set. If it is not, add it to the outcome set (see below). Attributes of inferred entity instances can be selected.
  2. Right-click and select from the following options:
    Set Expected Value... - opens the Edit Expected Result dialog box where you can specify a particular value for the expected result, an expected result of uncertain, or an expected result of unknown. You can also specify change points for the expected result.
    Set Expected Value to Default (<default expected result value>) - defaults the expected result to the value specified as the default value in the Edit Outcome dialog box.
    Set Expected Value to Current Value - sets the expected value to the current value of the attribute instance. The current value of the attribute instance is shown in angle brackets in the Value column in the Inferred Attributes list.
    Set Expected Value to true - sets the expected value to 'true'. (This option is only available for boolean attributes.)
    Set Expected Value to false - sets the expected value to 'false'. (This option is only available for boolean attributes.)
    Set Expected Value to Unknown - sets the expected value to 'unknown'.
    Set Expected Value to Uncertain - sets the expected value to 'uncertain'.
  3. The expected value is shown in square brackets after the current value of the attribute in the Value column in the Inferred Attributes list.

 

To do a bulk import of expected results:

  1. Right-click the test case on the Test Cases tab in your test script file and select Import Expected Results...
  2. In the Import Expected Results dialog, select where you want to import the expected results from. The options are:

    (Screenshot: Import Expected Results dialog in Policy Modeling)

  3. (Optional) If you select Actual values generated using rulebase, you need to specify the location of the rulebase you want to import expected results from ('your target rulebase'). TIP: The expected file format for your target rulebase is a .zip file. You will find this file in the output folder of your target rulebase project. Alternatively, if you have saved a copy of the file to another location, you can specify that location.

    To specify the location of your target rulebase, in the Import Expected Results dialog, either:
  4. In the Import Expected Results dialog, click OK.

Create an outcome set

A test script has an outcome set for its test cases, and this should contain all the inferred attributes that will be used in the comparisons that determine whether the rulebase produces the correct results.

The following types of attributes would be appropriate outcome attributes:

 

TIP: Too many outcome attributes increase initial start-up time and maintenance overheads, and can make the reports less manageable. The number of outcome attributes should therefore be limited to 10-12 where possible. For unit testing, the choice of outcome attributes may differ slightly, as the nature of unit testing means that intermediate attributes are monitored rather than the overall end result.

 

There are two ways to add outcomes to your test script: in the outcome set editor, or in the test case editor. Both methods are described below.

Attributes from any entity can be added as outcomes.

Add outcomes in the outcome set editor

The outcome set editor can be accessed by clicking on the Outcomes tab in the test script file.

To add an outcome attribute in the outcome set editor:

  1. Right-click anywhere in the outcome set editor and select Add New Outcome....
    The Select Attribute to Add as Outcome dialog will be displayed.



    (By default, only inferred attributes will be shown. If you want to see all attributes, uncheck the Only show inferred attributes check box.)
  2. Select the attribute you want to add as an outcome, then click OK.
    The Edit Outcome dialog is displayed.



  3. Change the Display Text for the attribute if you want to. This is the name that will appear in the attribute list in the outcome set editor, and in the regression tester report.
  4. Change the Value from unknown if appropriate. This is the value that the attribute instance will be set to when the attribute is first created. By default this value is set to "unknown". You can also specify change points for the attribute.
  5. Enter a Threshold Value if required (see below).
  6. Click OK. The new outcome attribute will now appear in the list of attributes in the outcome set editor.

TIP: Outcomes can be reordered in the outcome set editor by right-clicking and selecting Move Up or Move Down.

Add outcomes in the test case editor

To add an attribute as an outcome from the test case editor:

  1. Right-click on any inferred attribute in the right hand pane of the Data view. Select Add as outcome....
    The Edit Outcome dialog will be displayed.
  2. Follow steps 3 to 5 above.

 

Outcome attributes are shown underlined in the Inferred Attributes list in the test case editor.

Specify threshold values

Threshold values tell the regression tester that a given test case should pass if an actual value falls within a specified range. To specify a threshold for an attribute, select the Threshold Value tab in the Edit Outcome dialog.

 

 

The following table explains how to set a threshold:

Value (applies to date, currency or number attributes)

A date threshold is defined as a number of days, months or years. A number threshold can be either an absolute value or a percentage. Number and currency thresholds can be either integer or decimal values.

Apply threshold value to (applies to date, currency or number attributes)

Specifies whether the threshold applies above and/or below the expected outcome, as follows:

  • Both upper and lower bounds – the threshold will be applied as Y – T ≤ X ≤ Y + T (default)
  • Upper bounds only – the threshold will be applied as Y ≤ X ≤ Y + T
  • Lower bounds only – the threshold will be applied as Y – T ≤ X ≤ Y

where X = Actual Result, Y = Expected Result and T = threshold value.

Ignore

Specifies whether unknown and/or uncertain values should be ignored, as follows:

  • Unknown values – the test will pass if Expected Value = Actual Value (to within any specified threshold) OR Actual Value = unknown.
  • Uncertain values – the test will pass if Expected Value = Actual Value (to within any specified threshold) OR Actual Value = uncertain.
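The bounds and ignore behaviour described above can be sketched as a small comparison routine. This is illustrative Python for an absolute numeric threshold only (percentage and date thresholds follow the same pattern), and the mapping of 'upper'/'lower' to the formulas is taken from the bounds listed above, not from the product's internals:

```python
# Illustrative sketch of the threshold comparison, where
# X = actual result, Y = expected result, T = threshold value.
def within_threshold(actual, expected, threshold, bounds="both",
                     ignore_unknown=False, ignore_uncertain=False):
    if ignore_unknown and actual == "unknown":
        return True                      # unknown actuals are ignored
    if ignore_uncertain and actual == "uncertain":
        return True                      # uncertain actuals are ignored
    if not isinstance(actual, (int, float)):
        return False                     # non-numeric, non-ignored value
    if bounds == "both":                 # Y - T <= X <= Y + T (default)
        return expected - threshold <= actual <= expected + threshold
    if bounds == "upper":                # Y <= X <= Y + T
        return expected <= actual <= expected + threshold
    if bounds == "lower":                # Y - T <= X <= Y
        return expected - threshold <= actual <= expected
    raise ValueError("bounds must be 'both', 'upper' or 'lower'")
```

For example, with an expected result of 100 and a threshold of 5, an actual result of 102 passes under the default bounds, while 94 does not.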

Ignore results

You can flag an outcome so that any actual value for the outcome will be ignored when the test case is run. This will result in the expected outcome always passing. To do this, select the outcome attribute in the test case editor, right-click and select Ignore Result.

Delete invalid outcomes

To bulk delete attributes that are no longer used in your rulebase, right-click anywhere in the outcome set editor and select Delete Invalid Outcomes...

NOTE: If an entity no longer exists in the rulebase then all attributes belonging to that entity will be flagged as invalid.

Modify a test script

Test cases often need to be reviewed or modified to allow for changes in the rulebase. Changes can be made to individual test cases in the test case editor, or across multiple test scripts and test cases with the Update Test Script Wizard.

To make changes across multiple test scripts and test cases:

  1. In Oracle Policy Modeling, right-click on a test script, or on a folder that contains test scripts, and select Update Test Script Wizard.
    The Mass Update Test Script dialog is shown.



  2. Select one of the following options, each of which is explained further below:
    1. Insert Attribute
    2. Update Attribute
    3. Remove/replace missing attributes
    4. Remove/replace invalid relationships
    5. Set relationships to be known/unknown

Validate a test script

You have the option to validate a test script when it is opened, and to show a warning message if:

 

To change or view these settings, go to File | Project Properties | Regression Tester Properties | General.

View the details of a test script

The Test Specification report allows you to view the details of all of your test cases at once. To view the Test Specification report for one or more test scripts:

  1. In the Project Explorer, right-click on your test script or folder containing test scripts, and select View Test Script Specification.
  2. In the View Test Script Specification dialog, select the test scripts that you want included in the report. If you want the selected test scripts included in the same report, select the Combine all test scripts into one report option.
  3. Click View. The Test Specification(s) will be displayed in the right hand pane. You can save a copy of the Test Specification by clicking the Save button.

Remove a test script

To remove a test script from a project:

  1. In the Project Explorer in Oracle Policy Modeling, right-click the test script file that you want to remove and select Remove from Project.

 

NOTE: The file remains in your file system but has been removed from your Oracle Policy Modeling project. To permanently delete a file from both your file system and from your project, right-click it in Oracle Policy Modeling and select Delete.

Change the platform that the regression tester runs on

To change the runtime platform for the regression tester:

  1. In Oracle Policy Modeling, go to File | Project Properties | Common Properties | Platform.
  2. Select a different option from the Target Platform drop down list. (The options are .NET and Java, with .NET being the default platform.)
  3. Click OK.

Note that this setting also determines which platform the test script coverage analyzer and the what-if analyzer run on.