Business Intelligence Cloud Connector

Oracle Business Intelligence Cloud Connector (BICC) is the recommended integration option for exporting bulk data from Oracle Fusion Cloud Applications for downstream integration with data warehouses and other third-party applications.

Oracle Fusion Applications provide optimized business objects for data extractions, packaged as offerings that customers can extract in an automated fashion. For more information, see Creating a Business Intelligence Cloud Extract.

Key Features

  • Used for outbound flow only.
  • Provides prebuilt data extracts, called offerings, for Oracle Fusion Cloud Procurement applications. Each offering has associated business objects that are extracted together. Review each offering you plan to extract and configure it if necessary.
  • Allows customers to create custom offerings, adjust business objects in offerings, and even select the fields they're interested in. For the best experience, extract only the objects and fields necessary for your third-party integration.
  • Valid for initial data extract when setting up third-party integration with Oracle Fusion Applications or for ongoing incremental data extracts.
  • An ideal way to export large volumes of data from Oracle Procurement.
  • Supports both automated and manual data export processes, providing flexibility to users based on their specific requirements and IT infrastructure.
  • Can be configured to write the extracted data files to Oracle Universal Content Management (UCM) or Oracle Cloud Infrastructure (OCI) Object Storage.
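Whether the extract lands in UCM or OCI Object Storage, each VO's output arrives as compressed CSV files. As an illustration only (the file and column names here are hypothetical, and in practice you would validate the download against its accompanying manifest first), a downstream consumer might unpack an extract like this:

```python
import csv
import io
import zipfile

def read_extract_rows(zip_path: str) -> list:
    """Read every CSV member of a downloaded BICC extract zip into dicts."""
    rows = []
    with zipfile.ZipFile(zip_path) as zf:
        for name in zf.namelist():
            if not name.lower().endswith(".csv"):
                continue
            with zf.open(name) as member:
                # BICC extract files are UTF-8 CSVs with a header row.
                reader = csv.DictReader(io.TextIOWrapper(member, encoding="utf-8"))
                rows.extend(reader)
    return rows
```

This sketch assumes the zip has already been downloaded from UCM or OCI Object Storage by your transfer tooling.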

Best Practices

  • Jobs allow you to extract data from Oracle Fusion Applications to support multiple downstream integrations. Different jobs can serve different requirements and run on whatever schedule is needed, including running jobs that share the same data stores at the same time. It's a best practice to create and configure a dedicated job for each extraction.
  • Even for the same downstream requirement, it's advisable to create multiple jobs. For example, decouple heavy view-object (VO) extracts into separate jobs: including them in a common job could cause them to run late in the cycle and extend the extract window. Multiple jobs can run in parallel.
  • Jobs let you define priority groups and priority numbers within a job. Understanding job configuration and priority management is essential for optimal extract orchestration and performance.
  • If you configure both data and primary-key extracts, create two separate jobs, one for the data extract and one for the primary keys. If you keep them in a single job, BICC runs all the data extracts first and holds the primary-key extracts until the last data extract completes.
  • Use the entity-specific ExtractPVOs when extracting data with BICC. These public view objects (PVOs) are designed for maximum extract efficiency. Other PVOs, including Oracle Transactional Business Intelligence (OTBI) reporting PVOs, are available in BICC, but they can cause performance problems if used for integration purposes.
  • Audit the list of extract attributes for each VO and select the bare minimum of extract columns needed to meet your data integration business requirements.
    Note: By default, all columns are extracted. Select only the columns your use case requires; don't extract them all unless you really need them.
  • BICC has a default extract timeout of 10 hours per VO extract. Some large-volume VOs might need more than 10 hours to process initial volumes. You can override the default in Oracle BI Applications Configuration Manager to accommodate your initial extract: go to Manage Offerings and Data Stores > Actions > Job Setting > Extract preference > Timeout in Hours (default: 10).
  • Plan to run your initial BICC extract jobs outside of normal business hours, such as on weekends. Some initial extracts require larger TEMP and UNDO tablespaces, and running them during quieter periods minimizes the chance of running out of space.
  • Apply filters to your extraction queries so that only relevant data is retrieved. This not only speeds up the extraction process, it also reduces the volume of unwanted data and makes subsequent processing and analysis more efficient.
  • Maintain data dependencies across objects by setting a prune time. The prune time determines the extract date from which incremental data is included, ensuring data consistency and completeness.
  • Use broker mode to enhance data-fetching performance. Broker mode enables parallel processing and efficient data transfer, which significantly improves the speed and reliability of large data extractions.
  • Implement a regular purging mechanism to delete downloaded files from UCM. This helps manage storage space, prevents the accumulation of obsolete data files, optimizes storage costs, and keeps the data environment clean. You can use the BICC Delete Expired UCM Files job for this action.
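The column-pruning advice above can be sketched as a small planning step: given a VO's full attribute list, keep only the business columns the integration needs plus the VO's key columns, so downstream merges stay possible. The column names below are hypothetical; real attribute lists and keys come from the BICC data store definitions.

```python
def plan_extract_columns(all_columns, required_business_columns, key_columns):
    """Return the minimal column set to mark for extract: the business
    columns the integration needs plus the VO's key columns."""
    missing = set(required_business_columns) - set(all_columns)
    if missing:
        raise ValueError("Unknown columns requested: %s" % sorted(missing))
    # Key columns must always be extracted so downstream joins remain possible.
    selected = set(required_business_columns) | set(key_columns)
    # Preserve the VO's original attribute order for readability.
    return [c for c in all_columns if c in selected]
```

This is a planning aid only; the actual selection is made per data store in the BICC console.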
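The prune-time behavior described above can be illustrated as follows: the next incremental extract starts not from the last run time itself but from the last run time minus the prune interval, so late-committing rows just before the previous run are still picked up. The 30-minute default here is an assumption; use the value configured in your environment.

```python
from datetime import datetime, timedelta

def incremental_window_start(last_extract_time: datetime,
                             prune_minutes: int = 30) -> datetime:
    """Start of the next incremental window: last run minus the prune time."""
    return last_extract_time - timedelta(minutes=prune_minutes)
```

A larger prune time widens the overlap with the previous run (more duplicate rows to de-duplicate downstream) but lowers the risk of missing late-arriving changes.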
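The BICC Delete Expired UCM Files job handles purging on the UCM side; for copies you download to your own staging area, a simple retention sweep is one way to apply the same practice. The paths and the 30-day retention below are illustrative assumptions, not a prescribed policy.

```python
import os
import time

def purge_old_files(staging_dir: str, retention_days: int = 30) -> list:
    """Delete files older than the retention period; return the names removed."""
    cutoff = time.time() - retention_days * 86400
    removed = []
    for name in sorted(os.listdir(staging_dir)):
        path = os.path.join(staging_dir, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return removed
```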

Constraints

  • Not suitable for real-time data extraction.
  • Frequent and concurrent large data extracts can impact system performance, so it's important to consult each Oracle Procurement application guide for information about how to manage these situations.
  • BICC itself doesn't support high-frequency scheduling (for example, every 5 minutes) through its native scheduling capabilities.
  • Flex VOs are typically generated dynamically, so explicitly marking columns for extract doesn't work in Oracle Business Intelligence broker mode. These VOs should continue to use Oracle Business Intelligence server mode.
Note: Using Oracle Analytics Publisher to extract data from Oracle Fusion Applications is an unsupported pattern and should not be used by customers. If alternatives such as BICC or REST APIs can't accomplish your use case and you need to use Oracle Analytics Publisher for extraction, we strongly recommend that you create a custom Oracle Enterprise Scheduler (ESS) job of type BIPJobType and use it to schedule and run your report. You can use the downloadEssJobExecutionDetails (synchronous) or exportBulkData (asynchronous) ERP Integration Service operations to fetch the generated report content.
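When you do fall back to a custom ESS job plus the ERP Integration Service, the report content can only be fetched once the job reaches a terminal state, so the client needs a polling loop. A minimal, transport-agnostic sketch follows; the status strings and the injected get_status callable are assumptions, to be wired to your SOAP client's job-status call:

```python
import time

# Assumed terminal ESS job states; verify against your environment.
TERMINAL_STATES = {"SUCCEEDED", "ERROR", "WARNING", "CANCELLED"}

def wait_for_ess_job(get_status, poll_seconds: float = 30.0,
                     timeout_seconds: float = 3600.0) -> str:
    """Poll get_status() until the ESS job reaches a terminal state."""
    deadline = time.monotonic() + timeout_seconds
    while True:
        status = get_status()
        if status in TERMINAL_STATES:
            return status
        if time.monotonic() >= deadline:
            raise TimeoutError("ESS job still %s after %ss" % (status, timeout_seconds))
        time.sleep(poll_seconds)
```

Once this returns SUCCEEDED, fetch the generated report content with downloadEssJobExecutionDetails (synchronous) or exportBulkData (asynchronous).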