11 Working with A/B Testing

The Oracle WebCenter Sites: A/B Testing feature enables you to experiment with design and content variations on your website pages and determine which variations produce the optimal results, before making permanent changes to your website. Use the feature's in-context, visual method to create A/B tests on any pages delivered by Oracle WebCenter Sites.

Note:

A/B test functionality for this Oracle WebCenter Sites product is provided through an integration with Google Analytics. To use the A/B test functionality for this Oracle product, you must first register with Google for a Google Analytics account that will allow you to measure the performance of an “A/B” test.  In addition, you acknowledge that as part of an “A/B” test, anonymous end user information (that is, information of end users who access your website for the test) will be delivered to and may be used by Google under Google’s terms that govern that information.

For example, you might use A/B Testing to explore:

  • Which banner image resulted in more leads generated from the home page?

  • Which featured article resulted in visitors landing on a promoted section of the website?

  • Do visitors spend more time on the home page with a red banner displayed versus a blue banner?

  • Does adding a testimonial increase the click-through rate?

  • Which page layout resulted in more visitors downloading the featured white paper?

11.1 A/B Test Overview

Creating and running A/B tests involves the following main steps, which are described in detail throughout this chapter:

  1. Decide what the test will compare.

    An A/B test compares two or more variants, where variant A serves as the control (base), and variant B and any additional variants (C, D, and so on) are each compared with A.

    You can run multiple A/B tests at the same time.

  2. Decide what the test will measure.

    You use a goal to specify the visitor action the test will capture, compare across the variants, and then display as A/B test results. A goal refers to a specific visitor action that you identify for tracking, and you can specify the type of goal that fits your use case. The default goal types are Destination, Duration, Pages per session, and Event. For example, if you choose Destination, you can specify a page like surfing.html. A conversion occurs whenever a visitor reaches this page.

  3. In A/B Test mode in the Oracle WebCenter Sites: Contributor interface, create an A/B test and add variants to it.

    On the management system, create the test and use the WYSIWYG A/B test mode to add one or more variants to a selected web or mobile page. You can identify a variant by its color-coding, and easily save or discard its tracked changes. For example, the first variant (B) displays in green both in the A/B Test panel and in the numbered circles that mark each change on the page. See Creating A/B Tests.

  4. Select criteria for the A/B test, such as its start and end, the visitors or segments to target, the conversion goal to measure, and the confidence level. See Setting Up A/B Test Criteria.

    Figure 11-2 A/B Test Criteria Pane (showing start and end, visitor, confidence, conversion, and target controls)
  5. Approve the test (and its dependencies). Approving the test creates the corresponding experiment in Google Analytics, where you can view it and its settings.

    See Approving and Publishing A/B Tests.

  6. View test results in the A/B Test Report.

    As visitors view variants and conversions take place, you can view the results for individual A/B tests on the report, which maintains the same variant color-coding you saw while creating test variants. The report enables you to see whether there are measurable differences in visitor actions between variants. A cookie is set so that returning visitors see the same variant. See Viewing A/B Test Results.

    Figure 11-3 A/B Test Showing Variants

  7. Optionally update the site to use the winning variant.

    After displaying the winning variant for a while, you might copy the test or create a new A/B test with new variants, enabling you to refine the page iteratively over time.

11.2 Before You Begin A/B Testing

Creating A/B tests requires the following prerequisites to be in place:

  • The A/B test asset type WCS_ABTest must be enabled. By default it is not enabled, except in the “avisports” and “FirstSite II” sample sites.

    A WebCenter Sites administrator or developer enables asset types.

    For more information, see Administering A/B Testing in Administering Oracle WebCenter Sites.

  • The A/B code element must be included in templates on which A/B tests will be run. This code element is not included in templates by default.

    A WebCenter Sites developer adds the A/B code element to templates.

    For more information, see Template Assets in Developing with Oracle WebCenter Sites. An illustrative sketch of such a template call appears after this list.

  • The property abtest.delivery.enabled must be set to true on the delivery instance, that is, on any instance that will deliver the A/B test variants to site visitors. Do not set the property on instances that are used only to create A/B tests, because traffic on those instances would skew the results. The abtest.delivery.enabled property is in the ABTest category of the wcs_properties.json file.

    For more information, see A/B Test Properties in Property Files Reference for Oracle WebCenter Sites. An illustrative snippet of this property appears after this list.

  • Any user that will be able to “promote the winner” of an A/B test must be given the MarketingAuthor role. Any user that will be able to view A/B test reports and stop tests must be given the MarketingEditor role.

    For more information on configuring users, see Configuring Users, Profiles, and Attributes in Administering Oracle WebCenter Sites.
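
For reference, Sites templates typically pull in shared logic with the render:callelement JSP tag. The sketch below is illustrative only: the element name ABTest/IncludeTestCode is an assumption, not the element shipped with the product, so consult Template Assets in Developing with Oracle WebCenter Sites for the element your developers should actually call.

```jsp
<%-- Standard Sites render tag library. --%>
<%@ taglib prefix="render" uri="futuretense_cs/render.tld" %>

<%-- Hypothetical call that injects the A/B test code into every page
     rendered by this template; the element name is an assumption. --%>
<render:callelement elementname="ABTest/IncludeTestCode"/>
```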
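
Similarly, here is a minimal sketch of the delivery property as it might appear in wcs_properties.json. The property name and category are as documented, but the entry shape shown here is an assumption, and properties are normally edited through the Property Management Tool rather than by hand:

```json
{
  "key": "abtest.delivery.enabled",
  "value": "true",
  "category": "ABTest"
}
```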

11.3 Creating A/B Tests

Creating an A/B Test involves setting up variants of the web page to be tested, then setting up the criteria that will be applied during the test.

11.3.1 Switching In and Out of A/B Test Mode

Working with A/B tests in Contributor is similar to editing in Web view. Two key differences are that in A/B test mode you make page changes to a test layer of variants rather than to site pages themselves, and an A/B Test panel displays at right.

For information about switching in and out of A/B test mode, see these topics:

11.3.1.1 Switching Into A/B Test Mode

This task assumes you are in Contributor Web View, shown by the word “Web” in the toolbar view control.
  1. Display the browser page or mobile page on which to add a test.
  2. Click the A/B Test icon in the upper right of the menu bar.

    Note:

    When you hover the cursor over the icon, its tooltip indicates the number of tests that already exist for a page, if any (for example, "There are 2 A/B Tests for this page").

    If A/B testing is not enabled for the page, the A/B test is still created, but a warning message displays above the page.

If there are no A/B tests for a page, the panel at right contains a single control showing the text “Create a test”.

If one or more A/B tests already exist for a page, the panel at right shows the A/B Test controls and an option to change to the Criteria controls:

  • Use the A/B Test controls to create and edit tests for a page, and to track changes that you make to a page’s variants.

  • Use the Criteria controls to set the criteria for the selected test, such as its start and end determinants, which conversion to measure, and target and confidence information.

The controls in the toolbar also change when in A/B Test mode:

  • The toolbar label changes to A/B Test.

  • If the test has not yet been published, a Save icon and a Change Page Layout icon are available.

  • An Approve icon and a Delete icon are always available.

  • There are no Edit, Preview, Checkin/Checkout, or Reports icons.

11.3.1.2 Switching Out of A/B Test Mode

  • Click the X in the top right of the A/B Test panel.

If you have just created or made changes to the test without saving it, you are prompted to save the test before it is closed.

11.3.2 Creating, Copying, and Selecting A/B Tests

You can create A/B tests, make copies of them, and select existing tests.

For information about creating, copying, and selecting A/B tests, see these topics:

These tasks assume that you are working in A/B Test mode.

11.3.2.1 Creating an A/B Test for a Page

  1. In the list box containing either the text "Create a test" or a list of existing tests, click the + (plus) button beside the list box.
  2. In the Create New Test window, enter a name for the test in the top or only field.
  3. Click Submit.
  4. In the A/B Test toolbar, click the Save icon.

11.3.2.2 Copying A/B Tests

  1. Click the + (plus) button beside the list of existing tests.
  2. In the Create New Test window, select a test to copy in the Copy field.

    In the top field, you can change the default name assigned.

  3. Click Submit.
  4. In the A/B Test toolbar, click the Save icon.

11.3.2.3 Selecting A/B Tests

  • Open the drop-down list of existing tests and select the one you want.

After creating, copying, or selecting a test, you can work with its variants (described in Adding/Selecting and Editing A/B Test Variants) and select its criteria. The criteria you select apply to all variants of the test.

11.3.3 Setting Up A/B Test Variants

11.3.3.1 Adding/Selecting and Editing A/B Test Variants

In this procedure you define the A/B test by selecting a variant in the panel and making edits to its webpage, then use the color-coding and numbering to identify the selected variant and its changes.

To add/select and edit test variants:

  1. Enter A/B Test mode (see Switching Into A/B Test Mode) and select a test to work on in the Test field at the top of the panel.

    Below the test description field, variants of the current webpage are listed. Initially, there is the base webpage A and a single variant of this webpage B. You can add further variants of the base webpage, which will become C, D, E, etc.

    A, the original webpage, is the control.

    Each variant is identified by:

    • A letter. The B variant is already created and selected, ready for you to begin modifying it. (A is the control, the original webpage.)

    • A name. The name of the base webpage is initially "Base", and the names of the variants are initially "B", "C", "D", etc. You can change these names at any time.

    • A color-coded circle that displays to indicate the selected variant. For example, clicking B displays its green circle, clicking C displays its yellow circle, and so on. As you modify the selected variant's webpage, circles of the same color but with numbers display on the webpage.

    • Numbered changes. These display in the Tracked Changes section at the bottom of the panel. Changes are assigned numbers that correspond to the color-coded numbers on the webpage.

  2. Select or add a variant.
    • To select a variant, click its letter (for example, B). A circle in its assigned color displays.

    • To add a variant, click the + (plus) button below the last variant. A new variant is added and selected, and a circle in its assigned color displays. You can enter a name for the variant at any time. Your system may be configured to save the variant automatically at this point. If not, you must save the current variant by clicking the Save icon on the toolbar before switching to another variant.

  3. In the page editing area, make changes to the selected variant as you would change a page in web view. For example, you might change a headline or replace an image.

    You can use the search facility to find assets to include on a variant page. The search results panel will temporarily cover the A/B test panel. Move assets onto the variant page by dragging and dropping from the search results panel. Close the search panel to reveal the A/B test panel again.


    As you make changes, they are listed, described, and numbered in the Tracked Changes section. Their numbers correspond to the numbers displayed in color-coded circles on the webpage. Changes you have made but not saved are listed in the Unsaved Changes section. As you move the mouse over each item in the Tracked Changes section, that item is highlighted on the webpage.

    You can click the (Hide)/(Show) control to switch between hiding and showing the numbered circles on the webpage. You can also drag and drop the numbered circles to reposition them if they block your webpage view.

  4. Save or discard the tracked changes:
    • To save all the changes you have made to the webpage variant, click the Save button in the A/B Test toolbar.

    • To discard any change in the Tracked Changes list (even a saved one), click the X (delete) control that appears next to the change description when you hover your mouse over it. The variant is saved at the same time that the change is discarded.

  5. Save the test.

11.3.3.2 Changing the Page Layout of an A/B Test Variant

You may want a variant to use a different page layout from the control or other variants.

  1. After selecting a test and variant in the A/B Test panel, click the Change Page Layout icon in the toolbar.

    Figure 11-4 Page Layout Icon

  2. In the Change Page Layout window (Figure 11-5), select another page layout to use. The layouts selected for the control (A) and other variants are identified.

    Figure 11-5 Change Page Layout Window

  3. After you select a page layout for the test variant, the dialog refreshes to show Cancel and Apply buttons. Click Apply to apply the selected page layout to the current variant. The change shows in the Tracked Changes table but does not have a number.

11.3.3.3 Deleting A/B Test Variants

To delete a variant, click its X. If unsaved changes would be lost, you are prompted to save them before the variant is deleted.

11.3.4 Setting Up A/B Test Criteria

11.3.4.1 Specifying the Start and End of an A/B Test

When creating a test, you can specify a start and end for it. If you do not specify a start, the test will start as soon as it is published. Once a test starts running, you can view its current results, stop it early, and promote its winner, as described in Viewing A/B Test Results and Making Site Changes Based on A/B Test Results. You cannot edit a test after it is published.

  1. In A/B test view, click Criteria in the A/B Test panel (see Switching Into A/B Test Mode).
  2. Specify a test start date and time, in one of the following ways:
    • To start the test when A/B test assets are approved, leave the start field blank. The test will begin after publishing begins, as described in Approving and Publishing A/B Tests.

    • To start the test on a certain date, click the calendar next to the Start field and select a date from the calendar picker. By default, the start time is set to the current time; if you select the current date and time, the test starts upon publish. To specify another start time, use the HH, MM, and SS fields, then select the date again to apply the new time. You can also enter a new time by overwriting the value currently shown. Your server setup determines whether the start time uses a 12-hour or a 24-hour format; if it uses a 12-hour format, enter AM or PM.

  3. To end the test at a specific date and time, select Date in the End field, then click the calendar below and select a date and time from the calendar picker. Specify an end time the same way you specify a start time, as described above.
    The test will continue beyond the test end that you set here if you also set a confidence level that is not achieved by the test end point. See Selecting the Confidence Level of an A/B Test.
  4. Click Save.

11.3.4.2 Selecting the Confidence Level of an A/B Test

When creating a test, you can select a confidence level for its results. This number expresses how confident you can be that the test results are significant, that is, that conversion differences are caused by the variant differences themselves rather than by random variation among visitors. The larger the number of A/B test visitors, the easier it is to establish statistical significance for differences between variants.

If you set a confidence level, the A/B test will continue until the set level is reached, even beyond the set end point of the test.

For information about confidence level calculation, see How the A/B Test Confidence Level is Calculated.

To select a test confidence level:

  1. In A/B test view, click Criteria in the A/B Test panel (see Switching Into A/B Test Mode).
  2. In the Confidence field, select or enter a confidence level percent. You can select percents ranging from 85% to 99.9%.
  3. Click Save.

11.3.4.3 How the A/B Test Confidence Level is Calculated

The conversion rate refers to the number of visitors who performed the specified visitor action divided by the number of visitors to the page variant. This rate is calculated per variant.

The confidence interval is calculated using the Wald method for a binomial distribution. In the Wald Method (Figure 11-6), conversion rate is represented by p.

Figure 11-6 Wald Method of Determining Confidence Intervals for Binomial Distributions


The results for each variant are then used to calculate the Z-Score (Figure 11-7). The Z-Score measures how reliable the results are. A common confidence range used in A/B testing is 3% on either side of the final score; however, this is only a convention, and any range can be used. The range around the conversion rate (p) is determined by multiplying the standard error by that percentile of the standard normal distribution.

At this point the results must be determined to be significant; that is, it must be established that the conversion rates do not differ merely because of random variation. The Z-Score is calculated in this way:

Figure 11-7 Z-Score Calculation


The Z-Score is the number of standard deviations between the control and the test mean values. Using the standard confidence interval described above, statistical significance at the 95% level is determined when the view event count is greater than 1000 and the Z-Score probability is either greater than 95% or less than 5%.
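
To make the rule concrete, here is a minimal, self-contained Java sketch of the calculation described in this section. It is an illustration only, not the product's implementation: the class and method names are invented, the normal CDF uses a standard Abramowitz-Stegun polynomial approximation, and the visitor and conversion counts in main are hypothetical.

```java
public class ABTestSignificance {

    /** Conversion rate: conversions divided by visitors to the variant. */
    static double rate(long conversions, long visitors) {
        return (double) conversions / visitors;
    }

    /** Two-proportion Z-Score of variant B measured against control A. */
    static double zScore(long convA, long nA, long convB, long nB) {
        double pA = rate(convA, nA);
        double pB = rate(convB, nB);
        double se = Math.sqrt(pA * (1 - pA) / nA + pB * (1 - pB) / nB);
        return (pB - pA) / se;
    }

    /** Standard normal CDF (Abramowitz-Stegun 26.2.17 polynomial approximation). */
    static double normalCdf(double z) {
        double t = 1.0 / (1.0 + 0.2316419 * Math.abs(z));
        double poly = t * (0.319381530 + t * (-0.356563782
                     + t * (1.781477937 + t * (-1.821255978 + t * 1.330274429))));
        double tail = 0.39894228 * Math.exp(-z * z / 2) * poly; // upper-tail area for |z|
        return z >= 0 ? 1 - tail : tail;
    }

    /** The rule as stated: view count above 1000, and Z probability above 95% or below 5%. */
    static boolean isSignificant(long convA, long nA, long convB, long nB) {
        double prob = normalCdf(zScore(convA, nA, convB, nB));
        return (nA + nB) > 1000 && (prob > 0.95 || prob < 0.05);
    }

    public static void main(String[] args) {
        // Hypothetical counts: 800 visitors per variant, 80 vs. 104 conversions.
        System.out.println(isSignificant(80, 800, 104, 800)); // prints: true
    }
}
```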

11.3.4.4 Specifying the Goal to Track in an A/B Test

Marketers are required to create and use custom goals in Google Analytics for A/B testing. These goals must be created before setting up A/B tests. You can create goals of your own, or you can use goals created by others.
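
Most marketers create these goals in the Google Analytics administration interface. For orientation only, a Destination goal for the surfing.html page mentioned earlier, expressed as a Google Analytics Management API (v3) goal resource, looks roughly like this; the goal name and field values are hypothetical:

```json
{
  "name": "Visited surfing page",
  "active": true,
  "type": "URL_DESTINATION",
  "urlDestinationDetails": {
    "url": "/surfing.html",
    "matchType": "EXACT",
    "caseSensitive": false
  }
}
```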

To specify the goal to track:

  1. In A/B test view, click Criteria in the A/B Test panel.
  2. Click Select Goal.
  3. In the Goals dialog box, choose a goal, then click Select.

    Goals are listed by name and type. If needed, click the Sort button to change their sort order. The selected goal's name displays under the Select Goal button.

  4. Click Save.

11.3.4.5 Specifying Visitors or Segments to Target in A/B Tests

You can randomly display an A/B test to a specified percent of either:

  • The entire visitor pool

  • One or more selected segments

For example, you might enter 50 to target the test to 50% of all visitors or to 50% of visitors in a segment of visitors 65 years or older. (For more information about segments, see Creating Segments.)

Within the specified percent, test variants display in equal percentages to the target visitors. For example, if a test targeting 30% of visitors includes two variants (B and C) in addition to the control (A), 10% of visitors would be shown control A, 10% variant B, and 10% variant C.

To specify visitors or segments to target:
  1. In A/B test view, click Criteria in the A/B Test panel.
  2. In the Target field, select Visitors to target any visitors, or Segments to target visitors who are part of a selected segment.
  3. If you selected Visitors in the previous step, enter the percentage of visitors to include in the test in the % of Visitors field that displays.
  4. If you selected Segments in the previous step, select one or more segments.
    • Click the Select Segments button that displays.

    • In the Target field in the Segments window, enter the percentage of visitors in selected segments to include in the test.

    • From the Segment Name column, select one or more segments. Click Sort to sort the segments list by name or modification date. To deselect a segment, click its x in the Selected Segments column.

    • Click Select. Selected segment names display under the Select Segments button.

  5. Click Save.

11.4 Approving and Publishing A/B Tests

After creating and editing your test's variants and specifying its criteria, the next step is to approve the test for publishing to the delivery system. The test is created in Google Analytics at this time; until this point, nothing exists in Google Analytics. After approving the test, you can log in to Google Analytics and see the experiment and all its settings.

A few items to consider about approval:

  • Approving the test also approves its variants. As with all webpage approvals, you must approve an asset and all of its dependencies.

  • The test must include a goal before you can approve it.

  • A/B tests are published to a destination that you select. This destination must itself already have been published, by an administrator. Consult your administrator if you find that no published destinations are available to you.

  • Upon publishing, the test will begin at the date and time you specified in the test's criteria. If that time has already passed, the test will start immediately.

To approve a test's assets for publishing and to start the test:

  1. In A/B test mode, select the test to approve.
  2. Click the Approve icon in the toolbar, and select a destination.
    The Approval screen lists all assets and dependencies.
  3. Approve all the assets and dependencies by clicking the Select All link, then clicking Approve With Dependencies.
For full details about publishing assets, see Approving and Publishing Content.

11.5 Using A/B Test Results

The results of your A/B tests are shown in A/B Test reports. You can use these to decide whether one of the variant pages you tested should be promoted to the active web site.

For information about using A/B test results, see these topics:

11.5.1 Viewing A/B Test Results

Once the test begins, whenever visitors view the control (A) or variant versions of the webpage, their site visit information is captured and these statistics become available in the A/B Test report. For completed tests, this report shows data that will let you compare the relative performance of the base and variant webpage designs. You can then use this information to decide whether to promote one of the variants to become the webpage that visitors see. For a full description of this report, see The A/B Test Report.

To view the A/B Test report in the Contributor interface:

  1. Search for A/B Tests.

    One way to do this is to select the search preset “A/B Test” from the drop-down list in the Search box, then click the Search button.

    For each test in progress, a search result box is shown containing a chart and additional information. This shows the latest results for the test, but is not the full report.

  2. In the lower right corner of the search result box, click the small book icon.
    This opens the A/B Test report, showing all data available.

    Depending on the status of the test, the button at the top right of this report will appear as follows:

    • If the test is in progress, the button will be red and say Stop. If you want to stop (and complete) the test, click this red Stop button. You cannot restart the test once it has stopped.

    • If the test is completed, the button will be green and say Promote. You can use the results shown in the report to decide if you want to promote a variant to be the page that is displayed to visitors. See Making Site Changes Based on A/B Test Results.

    • If the test is not published, the button will be green and say Edit. Click the button to edit the test variants further. You cannot edit a test once it is published.

11.5.2 The A/B Test Report

A/B Test reports show data that let you compare the relative performance of base and variant webpage designs. You can use this information to decide whether to promote one of the variants to become the webpage that visitors see.

Header Section

Figure 11-8 A/B Test Report: Header Section


Table 11-1 Features of the A/B Test Report Header Section

  • Product and report name area: Confirms that you are viewing an Oracle WebCenter Sites A/B Test Report.

  • Status area: Shows whether the test is in progress, complete, or not published.

  • Button: A red Stop button, a green Promote button, or a green Edit button, depending on the test's status.


Summary Section

Figure 11-9 A/B Test Report: Summary Section


Table 11-2 Features of the A/B Test Report Summary Section

  • Information area: Shows the name of the A/B test, the name of the web page on which it is running, the owner of the test, and the test’s description.

  • Conversion: The type of conversion that the test is monitoring.

  • Segments: If the test includes a segmented user base, this describes the segmentation. “All” means the user base is not segmented.

  • Stop: When or how the test is intended to stop.

  • Chart: The conversion rate for each variant, displayed together so that they can be compared.


Metrics Bar

Figure 11-10 A/B Test Report: Metrics Bar


Table 11-3 Features of the A/B Test Report Metrics Bar

  • Target: The percentage of all visitors, the end date, and the confidence percentage, as set up for the test.

  • Summary to date: The number of visitors served, the number of conversions, and the confidence percentage reached so far, for all variants combined. When the test is complete, if the results allow one variant to be identified as better than the others, it is shown as the winner.


Conversions Section

Figure 11-11 A/B Test Report: Conversions Section


Table 11-4 Features of the A/B Test Report Conversions Section

  • Select criteria: Normally set to Device.

  • Display: Conversion information for each variant.


Confidence Section

Figure 11-12 A/B Test Report: Confidence Section


Table 11-5 Features of the A/B Test Report Confidence Section

  • Chart: For each variant, the number of visitors, the number of conversions, the conversion percentage, the Z-Score (see How the A/B Test Confidence Level is Calculated), and the confidence figure.


11.5.3 Making Site Changes Based on A/B Test Results

Based on test results, you may decide to promote a particular test variant, which permanently replaces the control with the selected winning variant and displays it to all visitors.

To promote a variant:

  1. Display the A/B test's report, as described in Viewing A/B Test Results, and click Promote.
  2. From the list of test variants, select the variant to promote. (The Base web page A is not available for selection, because it is the control.)
    You determine which variant to promote. For example, the best choice might have a lower overall conversion rate but the highest results from mobile device visitors.
  3. Click Promote.
  4. In the form that displays the page information to publish, click Promote.
    To display the selected variant on the website for all visitors, you must now publish it. See Approving and Publishing A/B Tests.

11.6 Deleting an A/B Test

When you delete an A/B test, you will not be able to access any of the reports that were generated for it. However, the data is not destroyed and you can access EID directly and create your own reports to retrieve it.

If you attempt to delete an A/B test that is running, you will see a warning before you can complete the deletion.

Note:

The experiment that is created in Google is not deleted even if you delete the A/B test by performing the following steps. To remove an experiment from Google, log in to the Google Analytics interface and delete it.

To delete an A/B Test:

  1. In the A/B Test toolbar, click the Delete button.

    The web page is hidden and a Delete Asset(s) panel is shown. At this stage you can cancel the deletion by clicking the Go Back button in the toolbar.

  2. Select the assets that you want to delete.
  3. Click the Delete button. You will not see a warning unless the A/B test you are attempting to delete has been published.

You can return to the web page you were testing by clicking the Go Back button in the toolbar.