Viewing Model Metrics

To view model metrics:

  1. From Oracle Data Relationship Management Analytics, select Model.
  2. Click the filter bar and set filter options:
    • Model––Click the popup search list button, select models, and then click OK.

    • Model Status––Select a status:

      • Active and Hidden

      • Hidden Only

      • Active Only

    • Timeframe––Select a predefined timeframe or select Custom, and then select a Start Date. End Date is optional.

  3. Click Apply.

    Tip:

    Click the filter save button to save the current filter settings, reset to the default saved filter settings, or clear the current filter selections.

  4. Set how you want to view metrics in the Model Analysis dashboard:
    • Model Performance View––You can view performance in the List View or the Service Level Agreement Plot view. In the List View, models matching the filter selections are displayed with the following information: number of requests, average cycle time for the requests to complete, average number of request items, and average number of participants. The colors of the bars in the List View correspond to the colors in the pie chart. Click the name of a model to drill into specifics for the model. The List View can be sorted by Count, Avg Cycle Time, or Label. Cycle time is the request completion time minus the request submit time. When sorted by Avg Cycle Time, the model with the largest average cycle time is listed first (a worked cycle-time example follows this procedure).

      In the List View, click a model name to drill into the model. In the Service Level Agreement Plot view, click a point in the graph to drill into the model. The following tabs are available in the Model Performance View:

      • Model Overview tab––Provides overview information for the model and allows you to drill back into requests in Oracle Data Relationship Management.

        The following information is provided on the Model Overview tab:

        • Clickable segmented color bar to drill into requests based on due status––After clicking the bar, click the export to Excel button to export results to an Excel spreadsheet, or click the drill back to DRM button next to a request to drill back into the request in Data Relationship Management.

          After viewing content on this screen, click Back in the top right corner of the screen to return to the Model Overview screen. Do not use the browser Back button.

        • Request Duration––Request duration setting value for the filtered model. The value displayed is dynamic and changes if the setting is changed within the model. Displayed for reference when viewing additional metrics.

        • Claim Duration––Current claim duration setting value for the filtered model. The value displayed is dynamic and changes if the setting is changed within the model. Displayed for reference when viewing additional metrics.

        • Unclaimed Duration––Percentage of time spent in unclaimed status across all requests for the selected model

        • Cycle Time––Average time it took all requests for the selected model to complete

        • Participants––Average number of unique participants across all requests for the selected model

        • Conditional Stages––Number of stages in the model that are conditional. For example, in a 4-stage model that has 3 conditional stages, "3 of 4" is displayed.

        • Separation of Duties Enabled––Number of stages for which separation of duties is enabled

        • SLA Violations––Number of requests that exceeded the request duration or due date at the request level

        • Slippages––Number of requests that had at least one claim duration violation

        • Pushbacks––Number of requests that had at least one push back

        • Escalations––Number of requests that had at least one escalation

        • Auto-skipped––Number of requests with at least one conditional Enrich or Approve stage that was skipped

        • Auto-approved––Number of requests with at least one Enrich or Approve stage that was promoted (approved by the system)

        • Auto-committed––Number of requests with a skipped or promoted commit stage

          Note:

          Auto-Committed counts are not included in the Auto-Skipped or Auto-Approved counts.

      • Process Efficiency tab––Provides a breakdown analysis for each stage in a workflow model and identifies bottlenecks, resource requirements, and participant workload (a percentage calculation sketch follows this procedure).

        The following metrics are provided for the filtered model:

        • Average cycle time

        • Percentage of claimed vs. unclaimed duration

        • Request duration

        • Request claim duration

        • Total # of requests

        • Total resources

        The following metrics are provided for each stage of the filtered model:

        • Average cycle time

        • Percentage of cycle time the stage was idle

        • Number of requests that were delayed (where the claim duration had elapsed)

        • Number of person hours for the resources committed

        • Average number of participants

      • Throughput tab––Shows a graphical representation of the submitted requests for the selected model.

      • Participant tab––Shows participant metrics for a single model. The views can be filtered to include a minimum number of requests or items. Both views can be sorted by the percentage committed, request items, or participant name. You can click the segmented bar or the color-coded value to drill into the request list for rejected or committed requests.

    • Request Throughput View––Shows a graphical representation of the submitted requests for the filtered models. The default view shows metrics for all submitted, committed, and rejected requests for the selected timeframe. You can filter the display to show only Committed, Rejected, or Submitted requests.

    • Participant Performance View––Shows metrics in two views: Submitted By and Committed By. You can determine which participants are processing the most or fewest requests. Click Submitted By to view metrics based on users who submitted requests. Click Committed By to view metrics based on users who committed requests. The views can be filtered to include a minimum number of requests or items. Both views can be sorted by the percentage committed, request items, or participant name.
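
The average cycle time and the Avg Cycle Time sort order described for the List View can be illustrated with a small calculation. The following Python sketch uses made-up request data and hypothetical model names; it is illustrative only and does not show how Data Relationship Management Analytics computes the metric internally.

  from datetime import datetime
  from statistics import mean

  # Hypothetical sample data: (model name, submit time, completion time) per request.
  requests = [
      ("Add Cost Center", datetime(2023, 5, 1, 9, 0), datetime(2023, 5, 1, 17, 0)),
      ("Add Cost Center", datetime(2023, 5, 2, 9, 0), datetime(2023, 5, 4, 9, 0)),
      ("Change Entity", datetime(2023, 5, 1, 10, 0), datetime(2023, 5, 1, 12, 0)),
  ]

  # Cycle time = request completion time minus request submit time (expressed in hours here).
  cycle_hours = {}
  for model, submitted, completed in requests:
      cycle_hours.setdefault(model, []).append((completed - submitted).total_seconds() / 3600)

  # Average cycle time per model, with the largest average listed first,
  # mirroring the List View sorted by Avg Cycle Time.
  for model, hours in sorted(cycle_hours.items(), key=lambda kv: mean(kv[1]), reverse=True):
      print(f"{model}: {len(hours)} requests, avg cycle time {mean(hours):.1f} hours")

In this sample, Add Cost Center averages 28 hours across two requests and is listed before Change Entity, which averages 2 hours.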
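
The claimed versus unclaimed percentage and the per-stage idle percentage on the Process Efficiency tab are ratios of durations. The sketch below uses hypothetical per-stage durations and assumes that idle time means unclaimed time measured against the total cycle time; the product's internal calculation may differ.

  # Hypothetical per-stage durations (in hours) for one request.
  stages = {
      "Submit": {"unclaimed": 0.0, "claimed": 1.0},
      "Enrich": {"unclaimed": 6.0, "claimed": 2.0},
      "Approve": {"unclaimed": 3.0, "claimed": 1.0},
      "Commit": {"unclaimed": 0.5, "claimed": 0.5},
  }

  # Total cycle time for the request.
  total = sum(d["unclaimed"] + d["claimed"] for d in stages.values())

  # Percentage of claimed vs. unclaimed duration across the whole request.
  unclaimed_pct = 100 * sum(d["unclaimed"] for d in stages.values()) / total
  print(f"Unclaimed: {unclaimed_pct:.0f}%  Claimed: {100 - unclaimed_pct:.0f}%")

  # Percentage of the cycle time each stage sat idle (unclaimed).
  for name, durations in stages.items():
      print(f"{name}: idle {100 * durations['unclaimed'] / total:.0f}% of cycle time")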