51 Developing for A/B Testing

With the Oracle WebCenter Sites: A/B Testing feature, marketers can test variants of website pages (both design and content variations) to determine which page variants produce the best results.

For information on how marketers create and run A/B tests, see Working With A/B Testing in Using Oracle WebCenter Sites.

Topics:

  A/B Testing Prerequisites
  Scripting Templates for A/B Testing
  Updating Cache Criteria for A/B Testing
  Viewing A/B Test Details as JSON
  Understanding Confidence Algorithms

51.1 A/B Testing Prerequisites

To use the A/B Testing feature, you must include the A/B Testing code element in your templates, enable the A/B Testing property in the wcs_properties.json file, and obtain Google Analytics IDs.

Note:

A/B test functionality for this Oracle WebCenter Sites product is provided through an integration with Google Analytics. To use the A/B test functionality for this Oracle product, you must first register with Google for a Google Analytics account that will allow you to measure the performance of an “A/B” test. In addition, you acknowledge that as part of an “A/B” test, anonymous end user information (that is, information of end users who access your website for the test) will be delivered to and may be used by Google under Google’s terms that govern that information.

Before the A/B Testing feature can be used, a developer must do the following:
  1. Include the A/B Testing code element in templates on which A/B tests will run. See Scripting Templates for A/B Testing.

  2. Set the property abtest.delivery.enabled to true on the delivery instance, that is, any instance that will deliver the A/B test variants to site visitors. Do not set the property on instances that are used only to create the A/B tests, because doing so will give false results. The abtest.delivery.enabled property is in the ABTest category of the wcs_properties.json file (see the configuration sketch after these steps).

    See A/B Test Properties in the Property Files Reference for Oracle WebCenter Sites.

  3. Obtain all Google Analytics IDs and add them to the configuration.
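
For reference, the following is a minimal sketch of how the abtest.delivery.enabled entry might appear in the ABTest category of wcs_properties.json. The surrounding structure is an assumption for illustration; the exact layout of the file can vary by release, so edit the property through your usual configuration procedure.

    {
      "key": "abtest.delivery.enabled",
      "value": "true"
    }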

Other actions must be taken by users and administrators as prerequisites for A/B testing. For a complete list, see Before You Begin A/B Testing in Using Oracle WebCenter Sites.

51.2 Scripting Templates for A/B Testing

To enable A/B Testing, you add a single line of code, a call element, to your templates. This element generates JavaScript on the rendered page; when the page loads in the browser, the script calls back to the server to check whether any A/B tests apply to that page.

This line must be added to the templates of the pages to be used in A/B Testing. Many sites incorporate a design that includes a shared template on all pages. If your site uses such a design, you need to add the call element only to that single template. In the avisports sample website, a template named Head includes the call element.

To edit the template, follow these steps for each site that will use A/B Testing:

  1. In the Admin interface, search for and open the template.
  2. Click the Edit icon to open the template form for editing. Select Element.

    The Element form within the template is displayed.

    Figure 51-1 Element Fields for Head Template
  3. In the Element Logic field, add the following line:

    <render:callelement elementname="fatwire/includeABTesting" args="c,cid,pagename"/>

    It is recommended that you add this line at the top of the element code so that the JavaScript is active before anything else on the page is rendered. (A sketch of such a template follows these steps.)

    Note:

    The preceding sample code shows how to include the includeABTesting element using the render:callelement tag in a template. Alternatively, you can include the element using the Fragment API in a Controller. See Developer’s Samples Website.
  4. Save the template.
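
As a point of reference, the Element Logic of a shared head template might begin as in the following sketch. The taglib directives and the surrounding cs:ftcs markup are illustrative assumptions about a typical JSP template element; only the render:callelement line shown in step 3 is required, placed before the rest of the markup.

    <%@ taglib prefix="cs" uri="futuretense_cs/ftcs1_0.tld" %>
    <%@ taglib prefix="render" uri="futuretense_cs/render.tld" %>
    <cs:ftcs>
    <%-- A/B Testing call element first, so its JavaScript is active before the page renders --%>
    <render:callelement elementname="fatwire/includeABTesting" args="c,cid,pagename"/>
    <%-- ... remaining head markup: meta tags, style sheets, and so on ... --%>
    </cs:ftcs>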

From here, you need to add ab to the Cache Criteria for the page templates used on the site.

51.3 Updating Cache Criteria for A/B Testing

After you have added the call element code to templates to enable A/B Testing, you need to add ab to the Cache Criteria for the page templates used on the tested pages.

In the avisports sample website, the two page templates used are SectionLayoutGreen and SectionLayoutOrange. You must make this change on every page template on your site that is used in A/B Testing.

Note:

Creating a new template adds ab to the cache criteria by default.

To update the page template:
  1. In the Admin interface, find and open the page template used on your website in A/B Testing.
  2. Click the Edit icon to open the page template for editing. Select the Site Entry screen.

    The Site Entry form for the template is displayed.

    Figure 51-2 Site Entry for A/B Testing Template
  3. In the Cache Criteria field, add ab at the beginning of the comma-separated list. Do not delete any existing entries. (See the example after these steps.)
  4. Save the template.

    Repeat these steps for all page templates used on pages in A/B Testing.
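
For example, if a page template's Cache Criteria field reads c,cid,site (a hypothetical list; your templates will have their own entries), after the change it should read ab,c,cid,site.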

51.4 Viewing A/B Test Details as JSON

Each A/B Testing asset (an asset of the WCS_ABTest type) contains a field that specifically lists the changes made between the original web page (the A page) and the test page (the B page). These changes are referred to as the differential data, and are stored in JSON format in the Variations (JSON) field.

To view the differential data:

  1. In the Contributor interface, open the A/B Testing asset in Form mode.
  2. Select the Content tab.

    The asset is displayed in Form mode.

  3. View the differential data in the Variations (JSON) field. It describes all differences between the A page and the B page (and any additional variant pages, if used).

If needed, the differential data can be edited directly, although this is not recommended for contributors. Copy the JSON into a proper JSON editor, make the necessary changes, and then paste the result back into the Variations (JSON) field.
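
For orientation only, differential data of this kind typically records, for each variant, which page elements change and how. The following fragment is purely hypothetical, including every field name in it; the actual schema of the Variations (JSON) field is defined by WebCenter Sites and will differ.

    {
      "_note": "Hypothetical structure for illustration only, not the actual WCS_ABTest schema",
      "variants": [
        {
          "name": "B",
          "changes": [
            { "selector": "#banner h1", "attribute": "text", "value": "Summer Sale" }
          ]
        }
      ]
    }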

51.5 Understanding Confidence Algorithms

When marketers create A/B tests, they can select a confidence level for their results. This number expresses how confident they can be that the test results are significant, that is, that differences in conversions are caused by the differences between the variants rather than by random variation among visitors.

For information on how marketers select a confidence level, see Selecting the Confidence Level of an A/B Test in Using Oracle WebCenter Sites. The rest of this section provides details on how conversion confidence is calculated.

The conversion rate, typically represented by p, is the conversion event count divided by the view count. The percent change of the conversion rate is calculated by subtracting the p of page A from the p of page B and dividing the result by the p of page A. Counts are per user: for example, a user who converts 1,000,000 times is counted only once. The algorithm used to determine confidence is the Z-Score.
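
Written out, in LaTeX notation for the quantities just described:

    p = \frac{\text{conversion event count}}{\text{view count}}, \qquad
    \text{percent change} = \frac{p_B - p_A}{p_A}

For example, if p_A = 0.10 and p_B = 0.12, the percent change is (0.12 - 0.10) / 0.10 = 0.20, that is, a 20% improvement.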

The word "confidence" in this context refers to a statistical computation that determines how confident we are that the difference between the results for A and the results for B is actually caused by the variations on the pages, and not just by random variation between visitors.

The confidence interval is calculated using the Wald method for a binomial distribution.

Figure 51-3 Wald Method of Determining Confidence Intervals for Binomial Distributions

The larger the sample, the greater the confidence in the results. A common convention in A/B Testing is a confidence interval with a ±3% range around the final score, although any plus-or-minus range can be used. The confidence interval for the conversion rate (p) is then determined by multiplying the standard error by the percentile of the standard normal distribution that corresponds to the chosen range.

At this point, the results must be tested for significance; that is, it must be determined that the difference in conversion rates is not due to random variation. The Z-Score is calculated as follows:

Figure 51-4 Z-Score Calculation

The Z-Score is the number of standard deviations between the control mean and the test mean. Using the standard confidence interval described earlier, the results are considered statistically significant at the 95% level when the view event count is greater than 1000 and the Z-Score probability is either greater than 95% or less than 5%.