9 Maintenance and Migration


Queries and Performance

When an analysis runs, aggregation and processing are prioritized in the database so that as little data as possible is sent to the BI Server for further processing. When creating analyses, reinforce this principle to maximize performance.

Here's what happens when a query runs, in three phases:

  1. SQL compiles

  2. Database SQL runs

  3. Data set is retrieved and displayed in an analysis

Here's the query architecture with the steps that happen when a query runs.

Query process flow diagram, showing the flow from analysis creation through database query processing.

Phase 1: SQL compiles

  1. You create an analysis with columns from one or more subject areas, add appropriate filters and graphs, and then run it.

  2. The BI Presentation Service receives the request and sends a logical SQL statement, based on the subject areas and columns, to the BI Server.

  3. The BI Server does these things:

    • Correlates the logical SQL with view objects and view links mapped in the metadata repository. The SQL goes to the WebLogic ADF layer to get view object and view link SQL definitions, and Oracle Applications Cloud security is applied to these definitions.

    • Aggregates the view object query to create the physical database SQL.

Phase 2: Database SQL runs

  4. The database query, including the aggregations and data security, runs in the database.

Phase 3: Data set is retrieved and displayed

  5. The aggregated data set is returned to the BI Server, which merges the results and applies any additional calculations or filters.

  6. The final data set is returned to the BI Presentation Service, where it's formatted for display and shown in the analysis.

How You Maximize Performance

The way you design an analysis affects how the BI Server builds the physical query, which determines how much processing is focused in the database to maximize performance. You should also consider some other factors while creating ad-hoc analyses.

Here are some key things you can do to improve how your analyses perform:

  • Select only required subject areas and columns. What you select determines which view objects and database tables are used in the database query. Every unnecessary table means more queries and joins to process.

  • Add proper filters to use database indexes and reduce the result data returned to the BI Server.

  • Remove unnecessary visual graphs.

  • Remove unused data columns. Columns that aren't used in any graph are still included in the physical SQL execution, which can increase database query processing overhead.
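The effect of these choices shows up directly in the logical SQL the BI Server receives. As a sketch (the subject area, folder, and column names below are hypothetical), a lean analysis selects only the columns it displays and filters on an indexed date column:

```sql
-- Hypothetical logical SQL for a lean analysis: one subject area,
-- only the needed columns, and a time-bound filter the database can
-- satisfy with an index.
SELECT
  "Worker"."Person Name",
  "Assignment"."Department"
FROM "Workforce Management - Worker Assignment Real Time"
WHERE "Assignment"."Effective Start Date" >= TIMESTAMP '2024-01-01 00:00:00'
```

Every extra column or subject area added to the criteria widens this query, and every missing filter enlarges the data set returned to the BI Server.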

Here are some of the factors that can hurt query performance, and what you can do to improve it.

Data security

  Analyses may perform well for a user with broad security grants, but worse for users with more restrictions.

  Suggestions: Review and simplify your security. Minimize the number of roles and don't use complex custom queries to enforce data security.

Cross-subject areas

  Analyses that include two or more subject areas can impact performance.

  Suggestions: Review your analyses to see if you can remove some of the subject areas from the queries. Are all of the subject areas required? When you built the analysis, did you notice a performance impact while adding a particular subject area?

Hierarchies

  Hierarchies, particularly large ones, can impact performance. Queries on trees and hierarchical dimensions such as manager can have an impact on performance. The BI Server uses a column-flattening approach to quickly fetch data for a specific node in the hierarchy, but there's no pre-aggregation for the different levels of the hierarchy.

  Suggestions: Remove hierarchies to see if performance improves. It's also important to carefully craft any query to apply enough filters to keep the result set small.

Number of attributes

  Analyses often use a large number of columns that aren't required.

  Suggestions: Reduce the number of columns in the criteria as much as possible.

Flexfields

  Using too many flexfields in analyses can hurt performance in some cases.

  Suggestions: Remove flexfields to see if performance improves. Avoid flexfields in filters.

Data volumes

  Analyses that query large volumes of data take longer. Some may query all records but return only a few rows, requiring a lot of processing in the database or BI Server.

  Suggestions: Consider adding filters in your analysis, especially on columns with indexes in the database. Avoid blind queries, because they run without filters and fetch large data sets. All queries on large transactional tables must be time bound: for example, include a time dimension filter and additional filters to restrict by key dimensions, such as worker. Apply filters to columns that have database indexes in the transactional tables, which helps ensure that a good execution plan is generated for the query from the BI Server.

Subqueries (NOT IN, IN)

  Filtering with IN (or NOT IN) based on the results of another query means that the subquery is executed for every row in the main query.

  Suggestions: Replace NOT IN or IN subqueries with union set operators or logical SQL.

Calculated measures

  Calculating measures can involve querying a lot of data.

  Suggestions: Use the predefined measures in your subject areas wherever possible.

Filters

  Analyses with no filters, or with filters that allow a large volume of data, may hurt performance.

  Suggestions: Add filters, and avoid applying functions or flexfields to filter columns. If possible, use indexed columns for filters. Refer to the OTBI subject area lineage for your offering, which documents the indexed columns for each subject area.
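Two of these suggestions can be sketched in SQL. The table, column, and index names below are hypothetical; the shapes are what matter: a filter that compares an indexed column directly instead of wrapping it in a function, and a NOT IN subquery rewritten with a set operator.

```sql
-- 1. Index-friendly filter: compare the indexed date column directly.
--    Avoid:  WHERE TRUNC(creation_date) = DATE '2024-06-01'
--    (wrapping the column in a function can prevent index use)
SELECT order_id, creation_date
FROM   orders
WHERE  creation_date >= DATE '2024-06-01'
AND    creation_date <  DATE '2024-06-02';

-- 2. NOT IN rewritten with the MINUS set operator, so the second query
--    isn't evaluated once per row of the outer query.
--    Avoid:  SELECT worker_id FROM workers
--            WHERE worker_id NOT IN (SELECT worker_id FROM terminations)
SELECT worker_id FROM workers
MINUS
SELECT worker_id FROM terminations;
```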

Diagnose Performance Using Subject Areas

This example illustrates how to use the OTBI Usage Real Time and Performance Real Time subject areas to understand usage and performance, so that you can diagnose bottlenecks and determine whether analyses are running slowly or could be optimized.

The OTBI Usage Real Time subject area monitors usage trends for OTBI by user, analysis and dashboard, and subject area. The OTBI Performance Real Time subject area monitors usage as well as analysis execution time, execution errors, and database physical SQL execution statistics.

In this example, you create an analysis to determine the current number of users accessing OTBI so that you can determine system load, and a histogram analysis that identifies trends in long-running queries.

Analyze the Number of Users

  1. In the Reports and Analytics work area, click Create and select Analysis. Select the OTBI Usage Real Time subject area and click Continue.

  2. In the Select Columns page, expand your subject area and folders. Select User Count from the Facts - Usage Metrics folder and click Next. Because you aren't including a dimension in the analysis, this provides a total of all users.

  3. In the Select Views page, select a Table view and click Next.

  4. In the remaining pages, click Next to accept the defaults.

  5. In the Save page, enter a name for the analysis, select a catalog folder to save it in, and click Submit. The result is a single-column table with the number of users on the system. To refine this analysis, you could add the Report Name and Report Path columns from the OTBI Report folder to the analysis to determine which reports are in use.
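The analysis built in these steps corresponds to a very small logical SQL request. A sketch of what the BI Server receives (the exact quoting and folder names depend on your subject area metadata):

```sql
-- Approximate logical SQL for the user-count analysis: a single measure
-- with no dimension columns, so the result is one total row.
SELECT "Facts - Usage Metrics"."User Count"
FROM "OTBI Usage Real Time"
```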

Analyze Query Performance

  1. In the Reports and Analytics work area, click Create and select Analysis. Select the OTBI Performance Real Time subject area and click Continue.

  2. In the Select Columns page, expand your subject area and folders. Select Report Name from the OTBI Report folder and click Add.

  3. Add Total Execution Time twice from the Derived Metrics folder in the Facts - Performance Tracking folder.

  4. Add Report Row Count from the Query Execution Metrics folder. Click Next.

  5. Add Report Count from the Usage Metrics folder. Click Next.

  6. In the Select Views page, select a Pivot view and click Next.

  7. In the remaining pages, click Next to accept the defaults.

  8. In the Save page, name the analysis Performance Histogram, select My Folders, and click Submit.

  9. Click Browse Catalog and, in the catalog, navigate to My Folders and click Edit for your analysis.

  10. Select the Criteria tab. Click the Options button and select Sort Ascending for the first Total Execution Time column.

  11. Click the Options button for the second Total Execution Time column and select Edit Formula. In the Edit Formula dialog box, select Custom Headings, enter Total Execution Time Bin, and enter the following statement in the Column Formula field to bin results by ranges of execution time, from less than five seconds to greater than five minutes:

    CASE
      WHEN "Derived metrics"."Total Execution Time" <= 5 THEN 'Less than 5 Seconds'
      WHEN "Derived metrics"."Total Execution Time" BETWEEN 5 AND 30 THEN 'Between 5 and 30 Seconds'
      WHEN "Derived metrics"."Total Execution Time" BETWEEN 30 AND 60 THEN 'Between 30 and 60 Seconds'
      WHEN "Derived metrics"."Total Execution Time" BETWEEN 60 AND 120 THEN 'Between 60 and 120 Seconds'
      WHEN "Derived metrics"."Total Execution Time" BETWEEN 120 AND 300 THEN 'Between 120 and 300 Seconds'
      ELSE 'Greater than 300 Seconds'
    END

  12. In the Results tab, click Edit View for the pivot table.

  13. In the Layout pane of the Pivot Table editor:

    • Move the Total Execution Time Bin column to the Rows section.

    • Move Report Count to the Measures section.

    • Move the rest of the columns to the Excluded section so they aren't shown in the pivot table.

  14. Click Done.

    You can present this information as a graph or table to view how many reports are running too long. Those with the highest usage and the longest execution times can then be prioritized.

How to Tune Performance for Reports

SQL query tuning can improve the performance of reports. Diagnose issues by reviewing the query's explain plan. Here are some factors that can slow down query performance, and some suggestions for improvement.

Filters

  Reports with no filters, or with filters that allow a large volume of data, may hurt performance.

  Suggestions: Use filter conditions to limit data.

Joins

  Reports that join a lot of tables can run slowly.

  Suggestions: Remove any unnecessary joins.

Data volumes

  Reports with no filters, or with filters that allow a large volume of data, may hurt performance.

  Suggestions: Add filter conditions to limit data, preferably using columns with database indexes. Use caching for small tables.

Indexes

  Filters that use database indexes can improve performance.

  Suggestions: Use SQL hints to manage which indexes are used.

Subqueries

  Subqueries can impact performance.

  Suggestions:

    • Avoid complex subqueries; use Global Temporary Tables where necessary.

    • Avoid too many subqueries in WHERE clauses if possible. Instead, rewrite queries with outer joins.

Aggregation

  It helps performance to prioritize aggregation in the database.

  Suggestions:

    • Use Oracle SQL analytic functions for multiple aggregations.

    • Use CASE statements and DECODE functions for complex aggregate functions.
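The last two suggestions can be sketched together. The tables and columns below are hypothetical: a WHERE-clause subquery rewritten as an outer join, and CASE expressions pushed inside aggregates so the database produces several conditional totals in a single pass.

```sql
-- 1. Subquery in the WHERE clause rewritten as an outer join.
--    Avoid:  SELECT e.emp_id FROM employees e
--            WHERE e.dept_id NOT IN (SELECT d.dept_id FROM closed_depts d)
SELECT e.emp_id
FROM   employees e
LEFT OUTER JOIN closed_depts d ON d.dept_id = e.dept_id
WHERE  d.dept_id IS NULL;

-- 2. CASE inside aggregate functions: one scan of the table yields
--    multiple conditional totals, instead of one query per status.
SELECT dept_id,
       SUM(CASE WHEN status = 'OPEN'   THEN amount ELSE 0 END) AS open_total,
       SUM(CASE WHEN status = 'CLOSED' THEN amount ELSE 0 END) AS closed_total
FROM   invoices
GROUP BY dept_id;
```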

Review SQL Statements Used in Analyses

You can review logical and physical SQL statements using either of the following procedures.

Logical and Physical SQL

Logical SQL is non-source specific SQL that's issued to the Oracle BI Server for an analysis. Logical queries use column names from the subject areas in the Presentation Layer in the repository (RPD) metadata. Based on the logical request, the BI Server issues optimized source-specific SQL to the actual data sources in the Physical Layer of the metadata. If you have administrative privileges, you can review both logical and physical SQL for analyses.
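As an illustration of the difference (all names here are hypothetical), the same request looks quite different at the two layers: the logical SQL refers to Presentation Layer folders and columns, while the physical SQL the BI Server generates refers to the underlying tables and includes the joins and security predicates it adds automatically.

```sql
-- Logical SQL, as shown in the Advanced tab: subject-area names only.
SELECT "Worker"."Person Name", "Compensation"."Annual Salary"
FROM "Compensation - Salary Details Real Time"

-- Physical SQL the BI Server might generate from it: real tables,
-- joins, and a data-security predicate added automatically.
SELECT p.display_name, s.annual_salary
FROM   per_persons p
JOIN   cmp_salaries s ON s.person_id = p.person_id
WHERE  s.person_id IN (SELECT person_id FROM visible_persons)
```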

Edit Mode

  1. Open the analysis in Edit mode and click the Advanced tab.

  2. In the SQL Issued section, review the logical SQL statement.

Administration Page

  1. On the Administration page, in the Session Management section, click the Manage Sessions link.

    Note: You must be a Business Intelligence Administrator to access the Administration and Manage Sessions page.
  2. On the Manage Sessions page, in the Action column, click the View Log link to review the SQL statement.

Archive and Move Analytics

You can archive your analytic objects and move them to a new location. You can bundle the whole catalog, specific folders, or multi-component objects as a .catalog file and upload the file to unarchive its data in another location.

Note: Don't change folders that contain seeded content. Moving or adding a child folder to a seeded folder results in analytics not being visible in dashboards or infolets.

Create an Archive

  1. Locate the object in the catalog.

  2. Select More and then select Archive.

  3. In the Archive dialog box, select one or more of the following options:

    • Keep Permissions: Maintain the object or folder's existing permissions. If you don't select this option, the archiving process doesn't include any permissions, and upon unarchiving, the parent folder's permissions are assigned to all of the objects and folders.

    • Keep Time stamps: Maintain the Creation Time, Last Modified, and Last Accessed times assigned to the object or folder. Upon unarchiving, the Last Modified time is updated to indicate the time when the object or folder is unarchived. If you select this option, the Old option in the Paste Overview area of the Preferences dialog box is available when unarchiving. Use the Old option to overwrite existing catalog items older than the items in the archive.

      If you don't select this option, then the archiving process doesn't include time information and the Old option in the Paste Overview area of the Preferences dialog box isn't available.

  4. Click OK to download the archive file.

Move an Archived Object to a New Location

  1. Select the folder in the catalog where you want to upload the archived file.

  2. In the Tasks pane click Unarchive.

  3. In the Unarchive dialog box, browse for and select the archive file.

  4. Use the Replace option to specify whether to replace an existing folder or object with the same name.

    • All: Replace any existing folders or objects with the same names as folders or objects included in the archive file that you're uploading.

    • Old: Replace existing folders or objects only if they're older than those in the archive file.

    • None: Add any new folders or objects, but preserve any existing folders or objects.

    • Force: Add and replace all folders or objects.

  5. Use the Access Control Lists option to specify how permissions are assigned to the folders or objects when they're unarchived.

    • Inherit: Inherits the folder or object's permissions from its new parent folder.

    • Preserve: Preserves the folder or object's permissions as it was in the original, mapping accounts as necessary.

    • Create: Preserves the folder or object's permissions as it was in the original, creating and mapping accounts as necessary.

  6. Click OK.

Localize Catalog Captions

As an administrator, you can localize the names of reporting objects, or captions, that users create in the catalog. For example, you might localize the names of dashboards and analyses into Spanish and French.

To localize object names in the catalog, export the captions from the catalog, localize them, and upload back to the catalog.

  1. Export the default captions to an XML file.

  2. Localize the downloaded XML file.

  3. Upload the localized XML file.

After translated caption XML is uploaded, the new strings take effect once the cache refreshes. Translated strings are protected during an upgrade.

Export Captions from the Catalog

  1. Click Administration.

  2. Click Export Captions.

  3. Click Browse and select the folder you want to localize in the catalog, then click OK.

  4. Save the XML file.

Localize Your Captions

Translate the XML file into one or more languages.

Upload Localized Captions to the Catalog

After you have localized captions, upload the translated XML files, one for each language.

  1. Click Administration.

  2. Click Import Captions.

  3. Click Browse and select the localized XML file, then click OK.

  4. Use the Select Language option to select the language to which you have localized, then click OK.


To download, review, or delete imported captions files, use the Manage Captions option.