8  Maintenance and Migration

Query Execution and Performance

When an analysis executes, database aggregation and processing are prioritized to minimize the data returned to the BI Server for further processing. When creating analyses, reinforce this principle to maximize performance.

Query execution has three phases.

  1. SQL compilation

  2. Database SQL execution

  3. Data set return and presentation in an analysis

Here is the query execution architecture.

Figure: Query process flow, from analysis creation through database query processing.
Phase 1: SQL compilation

  1. You create an analysis, selecting columns from one or more subject areas, adding appropriate filters and graphs, and submit it for execution.

  2. The BI Presentation Service receives the request and sends a logical SQL statement, based on the subject areas and columns, to the BI Server.

  3. The BI Server correlates the logical SQL with the view objects and view links mapped in the metadata repository. It submits the logical SQL to the WebLogic ADF layer, retrieving the view object and view link SQL definitions, to which it applies Applications Cloud data security. Finally, it applies aggregations to the view object query to assemble the physical database SQL.

Phase 2: Database SQL execution

  4. The database query, including the aggregations and data security, is executed in the Oracle Applications Cloud database.

  5. The aggregated data set is returned to the BI Server, which merges the results and applies any additional calculations or filters.

Phase 3: Data set return and presentation

  6. The final data set is returned to the BI Presentation Service, where it's formatted for display and sent to the client.
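
To make steps 2 and 3 concrete, here's a hedged sketch of the kind of logical SQL statement the BI Presentation Service sends to the BI Server. The subject area, folder, and column names are hypothetical; the point is that logical SQL references presentation-layer names, not database tables.

  -- Hypothetical subject area and columns. Logical SQL names
  -- presentation-layer objects, not physical tables.
  SELECT
    "Worker"."Business Unit" saw_0,
    "Worker Counts"."Headcount" saw_1
  FROM "Workforce Management - Worker Assignment Real Time"
  ORDER BY saw_0

The BI Server maps these names to view objects and assembles the physical SQL, so the aggregation behind the Headcount measure runs in the database rather than in the BI Server.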

Design Analyses to Enhance Performance

The way you design an analysis influences how the BI Server formulates the physical query, which determines how much of the processing is pushed down to the database to maximize performance. Other factors can also hurt performance and should be considered when you create ad hoc analyses.

Here are a few key performance-enhancing design techniques.

  • Select only required subject areas and columns. The selection determines which view objects and database tables are used in the database query, and including any unnecessary tables increases query and join processing.

  • Add proper filters to use database indexes and reduce the result data set returned to the BI Server.

  • Remove unnecessary graphs.

  • Remove unused data columns. Columns in the criteria that aren't used in any graph are still included in the physical SQL execution, which can increase database query processing overhead.

Here are some of the factors that can hurt query performance and suggestions for improvement.

  • Security: Analyses may perform well for a user with broad security grants, but worse for users with more restrictions. Review and simplify your security: minimize the number of roles and avoid complex custom queries to enforce data security.

  • Data volumes: Analyses that query large volumes of data take longer. Some query all records but return only a few rows, requiring a lot of processing in the database or the BI Server. Consider adding filters in your analysis, especially on columns that have database indexes.

  • Cross-subject areas: Analyses that include two or more subject areas can impact performance. Review your analyses to see whether you can remove some of the subject areas from the queries. Are all of the subject areas required? As you built the analysis, did you notice a performance impact when you added a particular subject area?

  • Hierarchies: Hierarchies, particularly large ones, can impact performance. Remove hierarchies to see if performance improves.

  • Sub-queries (NOT IN, IN): Filtering with IN (or NOT IN) on the results of another query means that the sub-query executes for every row in the main query. Replace NOT IN or IN sub-queries with set operators in logical SQL, as shown in the sketch after this list.

  • Number of attributes: Analyses often include a large number of columns that aren't required. Reduce the number of columns in the criteria as much as possible.

  • Calculated measures: Calculating measures can query larger data volumes. Use the predefined measures in your subject areas wherever possible.

  • Flexfields: Using too many flexfields in analyses can hurt performance. Remove flexfields to see if performance improves, and avoid flexfields in filters.

  • Filters: Reports with no filters, or with filters that allow a large volume of data, can hurt performance. Add filters, avoid applying functions or flexfields to filter columns, and use indexed columns for filters where possible. Refer to the OTBI subject area lineage for your offering, which documents the indexed columns for each subject area.
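
As a hedged illustration of the sub-query suggestion, here's how a NOT IN filter might be rewritten as a set operation in logical SQL. The subject area, folder, and column names are hypothetical.

  -- Instead of a sub-query filter, which executes once per row:
  --   "Worker"."Person ID" NOT IN (SELECT "Person ID" FROM ...)
  -- express the exclusion as a set operation:
  SELECT "Worker"."Person ID" saw_0, "Worker"."Name" saw_1
  FROM "Workforce Management - Worker Assignment Real Time"
  MINUS
  SELECT "Worker"."Person ID" saw_0, "Worker"."Name" saw_1
  FROM "Workforce Management - Absence Real Time"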

How to Tune Performance for Analytics and Reports

When you create analyses and reports, avoid blind queries, and include sufficient filters when your analytics or reports use hierarchies.

Blind Queries

Avoid blind queries, which run without filters and fetch large data sets. These queries can perform poorly and can easily overload the application. All queries on large transactional tables must be time bound. For example, include a time dimension filter and additional filters to restrict by key dimensions, such as worker. Apply filters to columns that have database indexes in the transactional tables, which helps the BI Server generate a good execution plan for the query.
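
Here's a hedged sketch of a time-bound query; the subject area, folder, and column names are hypothetical, and the date column is assumed to have a database index.

  -- A time-bound filter plus a key-dimension filter keeps the scan small.
  SELECT "Worker"."Name" saw_0, "Assignment"."Department" saw_1
  FROM "Workforce Management - Worker Assignment Real Time"
  WHERE "Assignment"."Effective Start Date" >= TIMESTAMP '2024-01-01 00:00:00'
    AND "Worker"."Business Unit" = 'US Operations'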

Hierarchies and Trees

Queries on trees and hierarchical dimensions such as manager can have an impact on performance. The BI Server uses a column-flattening approach to quickly fetch data for a specific node in the hierarchy. But, because there is no pre-aggregation for the different levels of the hierarchy, it's important to carefully craft any query involving hierarchies to ensure that enough filters are applied to keep the result set small.
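
For example, here's a hedged sketch that pins a hierarchy query to a single node; the names are hypothetical.

  -- Filtering to one manager keeps the flattened hierarchy join small.
  SELECT "Manager Hierarchy"."Manager Name" saw_0, "Worker Counts"."Headcount" saw_1
  FROM "Workforce Management - Worker Assignment Real Time"
  WHERE "Manager Hierarchy"."Manager Name" = 'Pat Lee'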

Review SQL Statements Used in Analyses

You can review logical and physical SQL statements using either of the following procedures.

Logical and Physical SQL

Logical SQL is non-source specific SQL that's issued to the Oracle BI Server for an analysis. Logical queries use column names from the subject areas in the Presentation Layer in the repository (RPD) metadata. Based on the logical request, the BI Server issues optimized source-specific SQL to the actual data sources in the Physical Layer of the metadata. If you have administrative privileges, you can review both logical and physical SQL for analyses.
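
As a hedged illustration of the difference, here's a hypothetical logical request and a sketch of the physical SQL the BI Server might generate for it. Real physical SQL is far more verbose and includes the injected data security predicates.

  -- Logical SQL (presentation-layer names, hypothetical):
  SELECT "Department"."Name" saw_0, "Worker Counts"."Headcount" saw_1
  FROM "Workforce Management - Worker Assignment Real Time"

  -- Physical SQL sketch (table and column names hypothetical):
  SELECT d.name, COUNT(a.assignment_id)
  FROM departments d
  JOIN assignments a ON a.department_id = d.department_id
  WHERE 1 = 1 /* data security predicates are injected here */
  GROUP BY d.name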

Edit Mode

  1. Open the analysis in Edit mode and click the Advanced tab.

  2. In the SQL Issued section, review the logical SQL statement.

Administration Page

  1. On the Administration page, in the Session Management section, click the Manage Sessions link.

    Note: You must be a Business Intelligence Administrator to access the Administration and Manage Sessions page.
  2. On the Manage Sessions page, in the Action column, click the View Log link to review the SQL statement.

Localize Catalog Captions

As an administrator, you can localize the names of reporting objects, or captions, that users create in the catalog. For example, you might localize the names of dashboards and analyses into Spanish and French.

To localize object names in the catalog, export the captions from the catalog, localize them, and upload them back to the catalog.

  1. Export the default captions to an XML file.

  2. Localize the downloaded XML file.

  3. Upload the localized XML file.

After translated caption XML is uploaded, the new strings take effect once the cache refreshes. Translated strings are protected during an upgrade.

Export Captions from the Catalog

  1. Click Administration.

  2. Click Export Captions.

  3. Click Browse and select the folder you want to localize in the catalog, then click OK.

  4. Save the XML file.

Localize Your Captions

Translate the XML file into one or more languages.
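
The exact schema of the exported captions file depends on your release, so treat this as a hedged sketch only: caption files have historically used WebMessage entries, and you translate the TEXT element for each caption, saving one copy of the file per language.

  <!-- Hypothetical entry; check your exported file for the exact schema. -->
  <WebMessageTable lang="fr" system="_captions" table="Messages">
    <WebMessage name="kcaption_sales_dashboard">
      <TEXT>Tableau de bord des ventes</TEXT>
    </WebMessage>
  </WebMessageTable>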

Upload Localized Captions to the Catalog

After you have localized captions, upload the translated XML files, one for each language.

  1. Click Administration.

  2. Click Import Captions.

  3. Click Browse and select the localized XML file, then click OK.

  4. Use the Select Language option to select the language to which you have localized, then click OK.

  5. Save the XML file.

To download, review, or delete imported captions files, use the Manage Captions option.

Archive and Move Analytics

You can archive your analytic objects and move them to a new location. You can bundle the whole catalog, specific folders, or multi-component objects as a .catalog file and upload the file to unarchive its data in another location.

Note: Don't change folders that contain seeded content. Moving or adding a child folder to a seeded folder results in analytics not being visible in dashboards or infolets.

Create an Archive

  1. Locate the object in the catalog.

  2. Select More and then select Archive.

  3. In the Archive dialog box, select one or more of the following options:

    • Keep Permissions: Maintain the object or folder's existing permissions. If you don't select this option, the archiving process doesn't include any permissions, and upon unarchiving, the parent folder's permissions are assigned to all of the objects and folders.

    • Keep Timestamps: Maintain the Creation Time, Last Modified, and Last Accessed times assigned to the object or folder. Upon unarchiving, the Last Modified time is updated to indicate when the object or folder was unarchived. If you select this option, the Old option in the Paste Overwrite area of the Preferences dialog box is available when unarchiving. Use the Old option to overwrite existing catalog items that are older than the items in the archive.

      If you don't select this option, the archiving process doesn't include time information, and the Old option in the Paste Overwrite area of the Preferences dialog box isn't available.

  4. Click OK to download the archive file.

Move an Archived Object to a New Location

  1. Select the folder in the catalog where you want to upload the archived file.

  2. In the Tasks pane click Unarchive.

  3. In the Unarchive dialog box, browse for and select the archive file.

  4. Use the Replace option to specify whether to replace an existing folder or object with the same name.

    • All: Replace any existing folders or objects with the same names as folders or objects included in the archive file that you're uploading.

    • Old: Replace existing folders or objects only if they're older than the ones in the archive file.

    • None: Add any new folders or objects, but preserve any existing folders or objects.

    • Force: Add and replace all folders or objects.

  5. Use the Access Control Lists option to specify how permissions are assigned to the folders or objects when unarchived.

    • Inherit: Inherits the folder or object's permissions from its new parent folder.

    • Preserve: Preserves the folder or object's permissions as they were in the original, mapping accounts as necessary.

    • Create: Preserves the folder or object's permissions as they were in the original, creating and mapping accounts as necessary.

  6. Click OK.

Row Limits in Analyses

There's a limit of 65,000 records for analyses. When you run an analysis that contains more rows than that, the results are limited to 65,000 rows. Exports to Excel are limited to 25,000 rows and 50,000 cells.

It's recommended that you use BI Publisher reports to export large data sets.

About Warehouse Column Errors in Transactional Analyses

In some analyses and dashboards, you may encounter errors on columns that query non-transactional data sources.

Some subject areas and analyses in the catalog for offerings, including Oracle Customer Experience, query subject area columns mapped to data warehouse data used for historical analysis. When such historical warehouse data is included in transactional analyses, you may encounter errors if the warehouse isn't set up and configured.

In some cases, but not all, these conditions are managed by repository (RPD) metadata that supports switching from warehouse to transactional sources using variables.

BI Publisher

SQL Statements Supported by Data Models

BI Publisher data models support both standard and non-standard SQL.

Non-standard SQL includes the following:

  • Database procedures

  • Cursor statements that return nested result sets

  • Reference cursor functions that don't use a parameter for a SELECT statement, table name, or column name

  • Conditional queries that execute multiple SQL queries in a single data set using an if-else expression

However, SQL statements with the following keywords aren't supported: DELETE, INSERT, UPDATE, DROP, EXECUTE, ALTER, MERGE, DBMS_SQL, CREATE, AUTONOMOUS_TRANSACTION, PRAGMA, WRITETEXT, UPDATETEXT, LOAD, UNLOAD, DATABASE, GRANT, REVOKE, TRUNCATE.
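
For example, here's a minimal sketch of a reference cursor function of the kind a data model can call. The function, table, and column names are hypothetical, and the parameter supplies only a filter value, never a table or column name.

  -- Hypothetical function returning a reference cursor; the parameter
  -- is a bind value, not a table or column name.
  CREATE OR REPLACE FUNCTION get_employees (p_dept_id IN NUMBER)
    RETURN SYS_REFCURSOR
  IS
    l_cursor SYS_REFCURSOR;
  BEGIN
    OPEN l_cursor FOR
      SELECT employee_id, last_name, salary
      FROM employees
      WHERE department_id = p_dept_id;
    RETURN l_cursor;
  END get_employees;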

Apply Preferences

If you're logged in to BI Publisher and make changes to preferences, for example, time zone and locale, you can immediately apply the changes in your BI Publisher session using either of the following methods.

  • Append the &relogon=1 parameter to the end of the page URL and reload the page (see the example after this list).

  • Log out of BI Publisher and log in again.
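
For example, with a hypothetical host and page path:

  https://host.example.com/xmlpserver/<current page>?<existing parameters>&relogon=1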