You can debug and tune Transactional Business Intelligence queries that are running slowly. As discussed earlier, Transactional Business Intelligence is intended for real-time operational reporting; Oracle does not recommend using it for queries that fall outside this category or for querying very large volumes of historical data.
If a query is performing poorly, check its filters first. Unfiltered queries on very large tables are difficult to tune. When querying large transaction tables, ensure that effective time dimension filters restrict the data to a few days, weeks, or months. Also ensure that there are good filters on the dimensions used to analyze the fact data. Add filters wherever possible to restrict the data the query must process.
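For example, a well-filtered analysis issues logical SQL along the lines of the following sketch. The subject area, folder, and column names here are hypothetical placeholders; substitute the ones from your own analysis.

    SELECT
      "Time"."Fiscal Period",
      "Business Unit"."Business Unit Name",
      "Facts - Invoice Amounts"."Invoice Amount"
    FROM "Payables Invoices - Transactions Real Time"
    WHERE "Time"."Date" BETWEEN DATE '2024-01-01' AND DATE '2024-03-31'
      AND "Business Unit"."Business Unit Name" = 'US Operations'

The time filter limits the scan of the transaction fact table to a single quarter, and the dimension filter further reduces the rows that must be joined.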
Next, examine the physical query in the Oracle BI Server query log. The query log file is named nqquery.log; rotated copies of the log include a date and timestamp in the file name. Issues encountered in Transactional Business Intelligence queries can be diagnosed from the entries in nqquery.log.
You can enable query logging in two ways:
Follow the steps in the “Diagnosing and Resolving Issues in Oracle Business Intelligence” chapter in Oracle Fusion Middleware System Administrator's Guide for Oracle Business Intelligence Enterprise Edition.
Set the Log Level system variable to 2 in the Oracle BI repository.
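As an illustration of the second option, many environments also let you raise the log level for a single analysis by entering a request variable prefix on the Advanced tab of that analysis. This assumes the standard LOGLEVEL system session variable is exposed and overridable in your repository; confirm with your administrator before relying on it.

    -- Prefix field on the Advanced tab of the analysis (assumes the
    -- LOGLEVEL system session variable is available and overridable):
    SET VARIABLE LOGLEVEL = 2;

With the log level at 2, the query log typically records the logical request, the physical SQL sent to the database, and basic row-count and timing information for that request.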
You may see additional filters added by the application logic and by data security. Ask your database administrator to help you review the execution plan for the query and identify which joins are causing performance issues. Avoid full table scans and hash joins on large transaction tables; if you see these, try to add filters on those tables that can use existing indexes.
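The sketch below shows one way a database administrator might inspect the execution plan of the physical SQL captured from the query log, using the standard Oracle Database EXPLAIN PLAN and DBMS_XPLAN utilities. The sample statement and its table and column names are hypothetical placeholders; paste the actual physical SQL from nqquery.log in its place.

    -- Replace this sample statement with the physical SQL from nqquery.log.
    EXPLAIN PLAN FOR
    SELECT inv.invoice_id, inv.invoice_amount
    FROM   ap_invoices_all inv
    WHERE  inv.invoice_date >= DATE '2024-01-01';

    -- Display the plan and look for TABLE ACCESS FULL or HASH JOIN steps
    -- against the large transaction tables.
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

If the plan shows a full scan of a large transaction table, that is usually the join or filter to address first.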
You can also enable query caching; queries that are answered from the cache can show a dramatic improvement in performance. Caching Options in Transactional Business Intelligence explains how to enable caching for Transactional Business Intelligence.