When you want to gather data about site activity, you design a metric. Think of a metric as the statistic that a test produces for use in a report. A test can produce one metric or several. For example, consider a test that shows two different Refer a Friend forms to users. Users are divided into two groups based on which Refer a Friend page they are shown. The goal of this test is to see whether one form entices users to refer more friends, as judged by the number of form submissions. This test would produce two metrics: the number of submissions of the first form, and the number of submissions of the second form.

Each metric needs three pieces of information defined for it: an event, logging data, and reporting data. When you create a metric, consider which event you want ATG Campaign Optimizer to watch and extract information from. For example, tracking user registrations would involve a Registration event. A custom metric can involve any event, such as an event you create yourself or one created previously for use in ATG Scenarios. It can also involve existing events that are not currently used for metrics. Using any such event requires you to create a custom metric.
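To make these three pieces concrete, here is a minimal sketch of how they might be grouped together, assuming a hypothetical MetricDefinition class. The names here are illustrative only and are not part of the actual ATG Campaign Optimizer API:

```java
import java.util.List;

// Hypothetical sketch only: these names are illustrative and are not
// part of the actual ATG Campaign Optimizer API.
public class MetricDefinition {

    private final String eventType;         // event to watch, e.g. a Registration event
    private final List<String> logFields;   // data to record when the event fires
    private final String reportLabel;       // label for the tallied result in the Results tab

    public MetricDefinition(String eventType, List<String> logFields, String reportLabel) {
        this.eventType = eventType;
        this.logFields = logFields;
        this.reportLabel = reportLabel;
    }

    public String eventType()       { return eventType; }
    public List<String> logFields() { return logFields; }
    public String reportLabel()     { return reportLabel; }
}
```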

Logging resources describe what information generated by the test should be saved and later evaluated to produce metrics. Some of this data describes the users in your test, such as their profile and participation group IDs. Other information is generated by an event itself, such as the amount of a user's order. The log holds raw data about every user involved in a test.
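The shape of that raw data can be pictured as one row per observed event, as in the following sketch. The TestLogEntry record and its field names are assumptions chosen for illustration, not the actual ATG Campaign Optimizer logging schema:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative shape of a raw log row; the field names are assumptions,
// not the actual ATG Campaign Optimizer logging schema.
record TestLogEntry(String profileId, String groupId, String eventType, double orderAmount) {}

class TestLog {
    private final List<TestLogEntry> entries = new ArrayList<>();

    // One row is appended for each user event observed during the test.
    void append(TestLogEntry entry) {
        entries.add(entry);
    }

    List<TestLogEntry> entries() {
        return List.copyOf(entries);
    }
}
```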

The reporting mechanism manipulates the log data, usually by counting rows or summing the values in certain columns of the logging database table, and displays the results in the Results tab of the ATG Campaign Optimizer user interface. One metric might report how many users participated in the test, broken down by group. Another, more complicated metric might report how many users in each group registered for the site. Although these examples use some of the same data, such as group IDs and profile IDs, the ways the ReportGeneratorService tallies them produce different results, each one a distinct metric.
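The tallying step amounts to an aggregate query over the logging table. The following JDBC sketch shows the kind of query involved in the participation metric described above; the test_log table and its test_id, group_id, and profile_id columns are assumptions for illustration, and this is not the actual ReportGeneratorService implementation:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.HashMap;
import java.util.Map;

// Sketch of the kind of aggregation the reporting step performs.
// The table and column names (test_log, test_id, group_id, profile_id)
// are assumptions for illustration, not the actual logging schema.
class ParticipationReport {

    // Counts distinct users per participation group for one test.
    Map<String, Integer> usersPerGroup(Connection conn, String testId) throws SQLException {
        String sql = "SELECT group_id, COUNT(DISTINCT profile_id) AS users "
                   + "FROM test_log WHERE test_id = ? GROUP BY group_id";
        Map<String, Integer> result = new HashMap<>();
        try (PreparedStatement stmt = conn.prepareStatement(sql)) {
            stmt.setString(1, testId);
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    result.put(rs.getString("group_id"), rs.getInt("users"));
                }
            }
        }
        return result;
    }
}
```

A registration metric would differ only in its aggregation, for example by adding a condition on the event type, which is why the same log columns can yield several distinct metrics.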