Customers Using Financial Consolidation and Close

For those Oracle Hyperion Financial Management customers moving to Financial Consolidation and Close, note the key differences between the two products:

  • Financial Consolidation and Close shows a positive amount as a debit and a negative amount as a credit.
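As a quick illustration of this sign convention, consider the following minimal sketch; the helper name and formatting are hypothetical, not part of the product:

```python
def display_amount(value: float) -> str:
    """Hypothetical helper illustrating the convention: Financial
    Consolidation and Close shows a positive amount as a debit and
    a negative amount as a credit."""
    if value >= 0:
        return f"{value:,.2f} DR"   # positive amount -> debit
    return f"{abs(value):,.2f} CR"  # negative amount -> credit

print(display_amount(1500.0))   # 1,500.00 DR
print(display_amount(-250.0))   # 250.00 CR
```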
  • A "plan type" is not a Financial Consolidation and Close concept.
  • Financial Consolidation and Close users can load Oracle General Ledger data into their own applications.

  • Users can export data from Financial Consolidation and Close to the Fusion GL as actuals.

  • Data synchronization can push data from either Planning or Financial Consolidation and Close to a Financial Consolidation and Close target application.

  • Financial Consolidation and Close can be used as a source system in the import format. In this way, you can use Financial Consolidation and Close as a source system, and then use a different cloud service (such as Planning Modules, Account Reconciliation, Planning, Profitability and Cost Management) as a target, and move data from Financial Consolidation and Close to these other cloud services.

    Additionally, you can pull data from Financial Consolidation and Close and push the data to a file for use in another application.

  • For a consolidation dimension, you can load different override amounts and rates for different source members by location. This enables you to report on details used to perform the various stages of the consolidation process.

  • In addition to the system predefined dimensions, you can create additional Custom dimensions based on your application needs. Custom dimensions are associated with the Account dimension and provide additional detail for accounts. If Extended Dimensionality is enabled for the application, you can create up to four Custom dimensions; if the application is enabled with the multi-GAAP reporting option, you can create up to three.

  • Data Integration supports a Financial Consolidation and Close "Period" as a column in a data file. If you have data for multiple periods in a single file, you can include the year and period on each row of the data. In Map Dimensions, you select the source period columns of Year and Period so that the system knows these columns are in the file, and then map them to the appropriate dimensions in the target system. See Loading Multiple Periods for EPM Cloud or File-Based Source Systems.
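A sketch of what such a multiple-period file might look like, and how its rows group by Year and Period; the column names and file layout here are assumptions for illustration only, not the product's required format:

```python
import csv
import io

# Hypothetical data file: Year and Period appear on every row, so a
# single file can carry data for multiple periods.
data_file = """Entity,Account,Year,Period,Amount
E100,Sales,FY24,Jan,1000
E100,Sales,FY24,Feb,1200
E200,Sales,FY24,Jan,800
"""

# Group rows by (Year, Period), mimicking how the Year and Period
# columns let each row be mapped to the right target period.
by_period = {}
for row in csv.DictReader(io.StringIO(data_file)):
    key = (row["Year"], row["Period"])
    by_period.setdefault(key, []).append(row)

for (year, period), rows in sorted(by_period.items()):
    print(year, period, "->", len(rows), "row(s)")
```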

  • Data Integration supports an explicit load method for loading journals to Financial Consolidation and Close. Journals are loaded by defining an integration with the type "Journals." Both Excel and text-based journal loads are supported. See Loading Journals to Financial Consolidation and Close in Administering Data Management for Oracle Enterprise Performance Management Cloud.
  • Drill through functionality is not supported for exchange rates data.
  • The import modes available to Financial Consolidation and Close are "append" and "replace."
  • For non-DSO applications (applications without Dense Sparse Optimization enabled), when you import data from Financial Consolidation and Close and use an Explicit mapping set, do not use attribute columns ATTR2 and ATTR3 for any dimension mappings. Data Integration uses these columns to determine the correct period key for the row.
  • Financial Consolidation and Close customers can extract dynamic calculated values by selecting the All Data option in Direct Integration Options (for more information, see Defining Direct Integration Options). To extract dynamic calculated values, Data Integration requires that either the CONTROL TO-DATE VIEW STORAGE setting is enabled in Financial Consolidation and Close, or the Financial Consolidation and Close application is DSO (Dense Sparse Optimization) enabled. For more information, see Using the Control To Date View Option.
  • The export modes available to Financial Consolidation and Close target application are:

    • Merge—If data already exists in the application, the system adds values from the load file to the existing data; no existing data is deleted. If data does not exist, the new data is created.
    • Replace—The system first deletes all values based on the scenario, year, period, entity, and data source, and then submits the load.
    • Accumulate—Accumulate the data in the application with the data in the load file. For each unique point of view in the data file, the value from the load file is added to the value in the application.
    • Dry Run—Scan a data load file for invalid records without loading it to the target application.
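The differences between these modes can be sketched with a toy model that treats the application as a dict keyed by point of view (POV). This only illustrates the semantics described above, not product code, and it reads Merge's "adds values" as inserting/overwriting cells rather than summing them:

```python
def merge(app: dict, load: dict) -> dict:
    # Merge: cells in the load file are written into the application;
    # existing cells not in the load file are left untouched.
    out = dict(app)
    out.update(load)
    return out

def accumulate(app: dict, load: dict) -> dict:
    # Accumulate: for each POV in the load file, the load value is
    # added to the value already in the application.
    out = dict(app)
    for pov, value in load.items():
        out[pov] = out.get(pov, 0.0) + value
    return out

def replace(app: dict, load: dict, in_region) -> dict:
    # Replace: first delete all values in the scenario/year/period/
    # entity/data-source region, then submit the load.
    out = {pov: v for pov, v in app.items() if not in_region(pov)}
    out.update(load)
    return out
```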
  • To load data to actual currency rather than entity currency when the currency is fixed, set the currency in the Functional Currency field in the Location option. You can also add a Currency row in the import format and map it.
  • Partial Data Loads—When loading data, all valid data is loaded. For example, if some rows fail cell-level validation rules, those rows are not loaded, but all other valid data is loaded. The partial load is reflected as a failed integration even though all valid data was loaded. If the user is an administrator, cell-level validations are ignored and all data is loaded.
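The partial-load behavior can be sketched as follows; the function and validation rule are hypothetical, and only the skip-invalid, flag-failed, and admin-bypass logic mirrors the description above:

```python
def load_rows(rows, is_valid, is_admin=False):
    """Illustrative partial-load model: valid rows load, invalid rows
    are skipped, and any skip marks the whole integration as failed.
    Administrators bypass cell-level validations entirely."""
    loaded, skipped = [], []
    for row in rows:
        if is_admin or is_valid(row):
            loaded.append(row)      # valid (or admin): row is loaded
        else:
            skipped.append(row)     # fails cell-level validation: skipped
    # A partial load still surfaces as a failed integration.
    status = "SUCCESS" if not skipped else "FAILED"
    return loaded, status
```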

  • When running an integration across instances in push mode, the credentials from the connection details are used to determine the load user, and not the user that submitted the integration for processing. If you set up the integration in the reverse manner in "pull" mode, then the user executing the integration drives the security when loading to the consolidation application.

  • When loading as an administrator, the data load bypasses security including validation rules, and all data is loaded.

  • When validation rules are turned on, and either the Enable Data Security for Admin Users option is enabled for an administrative user's load or a non-administrator is loading, the load blocks data from reaching the cells where validations apply but loads the rest of the data. That is, with security turned on, the system performs a partial load and then shows the integration rule as failed.

  • A cross-instance data load uses the user defined in the connection, and that user also determines the mode used to load the data, not the user executing the rule.