Tune Query Performance

You can tune query performance to optimize queries and speed up query execution times.

About Queries and Performance

When an analysis runs, aggregation and processing are prioritized in the database so that as little data as possible is sent to the BI Server for further processing. When creating analyses, you should reinforce this principle to maximize performance.

Here's what happens when a query runs, in three phases:

  1. SQL compiles.
  2. Database SQL runs.
  3. Result set is retrieved and displayed in an analysis.

Here's the query architecture with the steps that happen when a query runs.

Phase Step Description
1 1 You create an analysis with columns from one or more subject areas, add appropriate filters and graphs, and then run it.
1 2 Presentation Services receives the request and sends a logical SQL statement based on subject areas and columns to the BI Server.
1 3 The BI Server does these things:
  1. Correlates the logical SQL with view objects and view links mapped in the metadata repository.
    • The SQL goes to the WebLogic ADF layer to get view object and view link SQL definitions.
    • Oracle Fusion Cloud Applications security is applied to these SQL definitions.
  2. Aggregates the view object query to create physical database SQL.
2 4 The database query, including the aggregations and data security, runs in the database.
2 5 The aggregated dataset is returned to the BI Server, which merges the results and applies any additional calculations or filters.
3 6 The final dataset is returned to Presentation Services, where it's formatted for display and shown in the analysis.

Maximize Query Performance

The way you design an analysis affects how the BI Server builds the physical query, which determines how much processing is focused in the database to maximize performance.

Consider the factors described here when you create ad hoc analyses.

Here are some key things you can do to improve how your analyses perform:

  • Select only required subject areas and columns. What you select determines which view objects and database tables are used in the database query. Any unnecessary table means more queries and joins to process.
  • Add proper filters to use database indexes and reduce the result data returned to the BI Server.
  • Remove unnecessary visual graphs.
  • Remove unused data columns. Any columns not used in visual graphs are included in the physical SQL execution, which may increase database query processing overhead.
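To make the first two points concrete, here's a sketch of a logical SQL request that selects only the required columns and applies a time-bound filter. The subject area, folder, and column names are hypothetical placeholders, not names you should expect to find verbatim:

```sql
-- Hypothetical logical SQL: only two columns are requested, so the BI Server
-- maps only the view objects (and database tables) those columns need.
SELECT
  "Example Subject Area"."Worker"."Person Name",
  "Example Subject Area"."Facts"."Headcount"
FROM "Example Subject Area"
-- A time-bound filter keeps the result set small and lets the database
-- use an index on the underlying date column.
WHERE "Example Subject Area"."Time"."Date"
      BETWEEN TIMESTAMP '2024-01-01 00:00:00'
          AND TIMESTAMP '2024-03-31 00:00:00'
```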

Here are some of the factors that can hurt query performance, and what you can do to improve it.

Factor Description Suggestions
Security Analyses may perform well for a user with broad security grants, but worse for those with more restrictions. Review and simplify your security. Minimize the number of roles and don't use complex custom queries to enforce data security.
Cross-subject areas Analyses including two or more subject areas can impact performance. Review your analyses to see if you can remove some of the subject areas in the queries. Are all of the subject areas required? When you built the analysis, did you notice a performance impact while adding a particular subject area?
Hierarchies Hierarchies, particularly large ones, can impact performance.

Queries on trees and hierarchical dimensions such as manager can have an impact on performance. The BI Server uses a column-flattening approach to quickly fetch data for a specific node in the hierarchy. But there's no pre-aggregation for the different levels of the hierarchy.

Remove hierarchies to see if performance improves. It's also important to carefully craft any query to apply enough filters to keep the result set small.
Number of attributes Analyses often use a large number of columns that aren't required. Reduce the number of columns in the criteria as much as possible.
Flexfields Using too many flexfields in analyses can hurt performance in some cases. Remove flexfields to see if the performance improves. Avoid flexfields in filters.
Data volumes Analyses that query large volumes of data take longer. Some may query all records, but only return a few rows, requiring a lot of processing in the database or BI Server. Consider adding filters in your analysis, especially on columns with indexes in the database.

Avoid blind queries because they're run without filters and fetch large data sets. All queries on large transactional tables must be time bound. For example, include a time dimension filter and additional filters to restrict by key dimensions, such as worker. Apply filters to columns that have database indexes in the transactional tables. This ensures that a good execution plan is generated for the query from the BI Server.

Subquery (NOT IN, IN) Filtering on IN (or NOT IN) based on the results of another query means that the subquery is executed for every row in the main query. Replace NOT IN or IN queries with union set operators or logical SQL.
Calculated measures Calculating measures can involve querying a lot of data. Use the predefined measures in your subject areas wherever possible.
Filters Analyses with no filters or filters that allow a large volume of data may hurt performance. Add filters and avoid applying functions or flexfields on filter columns. If possible, use indexed columns for filters.

Refer to OTBI subject area lineage for your offering, which documents indexed columns for each subject area.

Review further guidelines about analysis and reporting considerations in My Oracle Support (Doc ID 2679006.1).
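The subquery suggestion above can be sketched in logical SQL. Assuming a hypothetical subject area with a Worker folder and an Employee ID column, a NOT IN filter can often be rewritten with a union set operator such as MINUS, so the set difference is computed once rather than the subquery being evaluated for every row:

```sql
-- Instead of a per-row NOT IN subquery such as:
--   SELECT "Worker"."Employee ID" FROM "Example Subject Area"
--   WHERE "Worker"."Employee ID" NOT IN
--     (SELECT "Worker"."Employee ID" FROM "Example Subject Area"
--      WHERE "Worker"."Status" = 'Terminated')
-- compute the set difference once with a union set operator:
SELECT "Worker"."Employee ID" FROM "Example Subject Area"
MINUS
SELECT "Worker"."Employee ID" FROM "Example Subject Area"
WHERE "Worker"."Status" = 'Terminated'
```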

Diagnose Performance Using Subject Areas

These examples illustrate how you could use the OTBI Usage Real Time and Performance Real Time subject areas to understand usage and performance, so that you can diagnose performance bottlenecks and understand whether analyses are running slowly or could be optimized.

The OTBI Usage Real Time subject area monitors usage trends for OTBI by user, analysis and dashboard, and subject area. The OTBI Performance Real Time subject area monitors usage as well as analysis execution time, execution errors, and database physical SQL execution statistics. See OTBI Usage and Performance Subject Areas Reference.

The OTBI usage subject areas are provisioned for two job roles: IT Security Manager and Application Implementation Consultant. To use them for other job roles, grant OTBI Usage Transactional Analysis Duty and OTBI Performance Transactional Analysis Duty to custom job roles in the Security Console. Refer to specific subject area guides to see security details for each subject area.

Analyze the Number of Users

Use this example to understand whether the number of users might impact performance.

In this example, you create an analysis to determine the current number of users accessing OTBI so that you can determine system load, and a histogram analysis that identifies trends in long-running queries.

  1. In the Reports and Analytics work area, click Create and select Analysis. Select the OTBI Usage Real Time subject area and click Continue.
  2. In the Select Columns page, expand your subject area and folders. Select User Count from the Facts - Usage Metrics folder and click Next. Because you aren't including a dimension in the analysis, this provides a total of all users.
  3. In the Select Views page, select a Table view and click Next.
  4. In the remaining pages, click Next to accept the defaults.
  5. In the Save page, enter a name for the analysis, select a catalog folder to save it in, and click Submit.

    The result is a single-column table with the number of users on the system. To refine this analysis, you could add the Report Name and Report Path columns from the OTBI Report folder to the analysis to determine which reports are in use.

Analyze Query Performance

Use this example to diagnose performance bottlenecks and understand whether analyses are running slowly or could be optimized.

  1. In the Reports and Analytics work area, click Create and select Analysis. Select the OTBI Performance Real Time subject area and click Continue.
  2. In the Select Columns page, expand your subject area and folders. Select Report Name from the OTBI Report folder and click Next.
  3. Add Total Execution Time from the Derived Metrics folder in the Facts - Performance Tracking folder two times.
  4. Add Report Row Count from the Query Execution Metrics folder.
  5. Add Report Count from the Query Execution Metrics folder. Click Next.
  6. In the Select Views page, select a Pivot view and click Next.
  7. In the remaining pages, click Next to accept the defaults.
  8. In the Save page, name the analysis Performance Histogram, select My Folders, and click Submit.
  9. Click Catalog and, in the catalog, navigate to My Folders and click Edit for your analysis.
  10. Select the Criteria tab. Click Options and select Sort Ascending for the first Total Execution Time column.
  11. Click Options for the second Total Execution Time column and select Edit Formula. In the Edit Formula dialog, select Custom Headings, enter Total Execution Time Bin as the heading, and enter the following statement in the Column Formula to bin by ranges of execution time, ranging from less than five seconds to greater than five minutes:
    CASE
      WHEN "Derived metrics"."Total Execution Time" <= 5 THEN 'Less than 5 seconds'
      WHEN "Derived metrics"."Total Execution Time" BETWEEN 5 AND 30 THEN 'Between 5 and 30 seconds'
      WHEN "Derived metrics"."Total Execution Time" BETWEEN 30 AND 60 THEN 'Between 30 and 60 seconds'
      WHEN "Derived metrics"."Total Execution Time" BETWEEN 60 AND 120 THEN 'Between 60 and 120 seconds'
      WHEN "Derived metrics"."Total Execution Time" BETWEEN 120 AND 300 THEN 'Between 120 and 300 seconds'
      ELSE 'Greater than 300 seconds'
    END
  12. In the Results tab, click Edit View for the pivot table.
  13. In the Layout pane of the Pivot Table editor:
    • Move the Total Execution Time Bin column to the Rows section.
    • Move Report Count to the Measures section.
    • Move the rest of the columns to the Excluded section so they aren't shown in the pivot table.
  14. Click Done.

    You can present this information as a graph or table to view how many reports are running too long. Those with the highest usage and the longest execution times can then be prioritized.

Tune Performance for Reports

You can use SQL query tuning to improve the performance of reports.

Here are some factors that can slow down query performance, and some suggestions for improvement.

Factor Description Suggestions
Filters Reports with no filters or filters that allow a large volume of data may hurt performance. Use filter conditions to limit data.
Joins Reports that join a lot of tables can run slowly. Remove any unnecessary joins.
Data volumes Reports with no filters or filters that allow a large volume of data may hurt performance. Add filter conditions to limit data, preferably using columns with database indexes. Use caching for small tables.
Indexes Filters that use database indexes can improve performance. Use SQL hints to manage which indexes are used.
Sub-queries Sub-queries can impact performance.
  • Avoid complex sub-queries and use Global Temporary Tables where necessary.
  • Avoid too many sub-queries in where clauses if possible. Instead, rewrite queries with outer joins.
Aggregation It helps performance to prioritize aggregation in the database.
  • Use Oracle SQL Analytical functions for multiple aggregation.
  • Use CASE statements and DECODE functions for complex aggregate functions.
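As a sketch of the last two suggestions, and assuming a plain employees table with hypothetical column names, a single database pass can combine a conditional CASE aggregate with an Oracle SQL analytic function instead of running separate queries:

```sql
-- Hypothetical physical SQL: one pass over the table performs both a
-- conditional aggregate and an analytic (window) calculation.
SELECT
  department_id,
  SUM(salary) AS dept_salary,
  -- CASE inside the aggregate avoids a separate query per condition.
  SUM(CASE WHEN status = 'ACTIVE' THEN salary ELSE 0 END) AS active_salary,
  -- Analytic function computes each department's share of the total.
  RATIO_TO_REPORT(SUM(salary)) OVER () AS share_of_total
FROM employees
GROUP BY department_id
```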

Review SQL Statements Used in Analyses

You can review logical and physical SQL statements using these steps.

Logical SQL is non-source specific SQL that's issued to the BI Server for an analysis. Logical queries use column names from the subject areas in the Presentation Layer in the repository (RPD) metadata. Based on the logical request, the BI Server issues optimized source-specific SQL to the actual data sources in the Physical Layer of the metadata. If you have administrative privileges, you can review both logical and physical SQL for analyses.
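For orientation, a logical SQL statement uses Presentation Layer names from a subject area. This sketch reuses the OTBI Usage Real Time columns mentioned earlier in this topic; the physical SQL shown in the comment is a simplified, hypothetical illustration of what the BI Server might generate, not actual output:

```sql
-- Logical SQL (Presentation Layer names from the subject area):
SELECT
  "OTBI Usage Real Time"."OTBI Report"."Report Name",
  "OTBI Usage Real Time"."Facts - Usage Metrics"."User Count"
FROM "OTBI Usage Real Time"
-- The BI Server compiles this into source-specific physical SQL, roughly:
--   SELECT t1.report_name, COUNT(DISTINCT t2.user_name)
--   FROM usage_reports t1 JOIN usage_sessions t2 ON ...
--   GROUP BY t1.report_name
```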

  1. If you want to view the logical SQL when editing an analysis:
    1. In the Reports and Analytics work area, click Edit to open the analysis for editing, then click the Advanced tab.
    2. In the SQL Issued section, review the logical SQL statement.
  2. If you want to view the SQL statement from the Administration page:
    1. Click My Profile and then click Administration.
    2. On the Administration page, in the Session Management section, click the Manage Sessions link.

      You must be a Business Intelligence Administrator to access the Administration and Manage Sessions page.

    3. On the Manage Sessions page, in the Action column, click the View Log link to review the SQL statement.