Running Data Rules

Executes a Data Management data load rule based on the start and end periods and the import and export options that you specify.

Prerequisites

  • Data Rules: Data load rules define how Data Management loads data from a file. Data load rules must be predefined before you can load data.

  • You must have the required privileges to execute a specific data rule.

REST Resource

POST /aif/rest/{api_version}/jobs

Request

Supported Media Types: application/json

Parameters

The following table summarizes the client request.

Table 10-5 Parameters

api_version

Version of the API you are working with, such as V1. (Type: Path; Required: Yes; Default: None)

jobType

Must be set to "DATARULE". (Required: Yes; Default: None)

jobName

The name of a data load rule defined in Data Management. Enclose the rule name in quotation marks if it contains a space. (Required: Yes; Default: None)

startPeriod

The first period for which data is to be loaded. This period name must be defined in Data Management period mapping. (Required: Yes; Default: None)

endPeriod

The last period for which data is to be loaded. This period name must be defined in Data Management period mapping. (Required: Yes; Default: None)

importMode

Determines how the data is imported into Data Management. (Required: Yes; Default: None)

Acceptable values are:

  • APPEND to add to the existing POV data in Data Management

  • REPLACE to delete the POV data and replace it with the data from the file

  • RECALCULATE to skip importing the data, but reprocess the data with updated Mappings and Logic Accounts

  • NONE to skip the data import into the Data Management staging table

exportMode

Determines how the data is exported from Data Management to the target application. (Required: Yes; Default: None)

Acceptable values for Planning Modules and Planning are:

  • STORE_DATA to merge the data in the Data Management staging table with the existing Planning data

  • ADD_DATA to add the data in the Data Management staging table to Planning

  • SUBTRACT_DATA to subtract the data in the Data Management staging table from the existing Planning data

  • REPLACE_DATA to clear the POV data and replace it with data in the Data Management staging table. The data is cleared for Scenario, Version, Year, Period, and Entity

  • NONE to skip the data export from Data Management to Planning

Acceptable values for Financial Consolidation and Close and Tax Reporting are:

  • REPLACE to delete the POV data and replace it with the data from the file
  • MERGE to merge the new data with the existing data. By default, all data loads are processed in Merge mode: if data already exists in the application, the system overwrites it with the new data from the load file; if it does not, the new data is created
  • NONE to skip the data export

fileName

An optional file name. If you do not specify a file name, this API imports the data contained in the file name specified in the data load rule. The data file must already reside in the inbox prior to data rule execution. (Required: No; Default: None)

To import data files from the EPM inbox, accessible from the Application Inbox/Outbox Explorer, reference the files in that folder using #epminbox/<filename>.

Example URL

https://<SERVICE_NAME>-<TENANT_NAME>.<SERVICE_TYPE>.<dcX>.oraclecloud.com/aif/rest/V1/jobs

Example of Request Body

{
    "jobType": "DATARULE",
    "jobName": "aso to bso dr",
    "startPeriod": "Dec-18",
    "endPeriod": "Dec-18",
    "importMode": "REPLACE",
    "exportMode": "NONE",
    "fileName": "#epminbox/TestData.txt"
}
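A request like the one above can be submitted from Python using only the standard library. This is a minimal sketch, not part of the documented API: the service URL is the placeholder from the example above, `build_datarule_payload` and `submit_data_rule` are illustrative helper names, and the credentials must be a user with the required data rule privileges.

```python
import base64
import json
import urllib.request

# Placeholder service URL -- substitute your own EPM instance.
SERVICE_URL = "https://<SERVICE_NAME>-<TENANT_NAME>.<SERVICE_TYPE>.<dcX>.oraclecloud.com"


def build_datarule_payload(job_name, start_period, end_period,
                           import_mode="REPLACE", export_mode="NONE",
                           file_name=None):
    """Assemble the request body described in Table 10-5."""
    payload = {
        "jobType": "DATARULE",
        "jobName": job_name,
        "startPeriod": start_period,
        "endPeriod": end_period,
        "importMode": import_mode,
        "exportMode": export_mode,
    }
    if file_name is not None:
        # Files in the EPM inbox are referenced as #epminbox/<filename>.
        payload["fileName"] = file_name
    return payload


def submit_data_rule(payload, user, password):
    """POST the payload to /aif/rest/V1/jobs using basic authentication."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    request = urllib.request.Request(
        SERVICE_URL + "/aif/rest/V1/jobs",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": "Basic " + token},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)
```

For example, `submit_data_rule(build_datarule_payload("aso to bso dr", "Dec-18", "Dec-18", file_name="#epminbox/TestData.txt"), user, password)` would POST the request body shown above and return the parsed JSON response.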

Response

Supported Media Types: application/json

Table 10-6 Parameters

Name Description

status

Status of the job: -1 = in progress; 0 = success; 1 = error; 2 = cancel pending; 3 = cancelled; 4 = invalid parameter

jobStatus

A text representation of the job status, with one of the following values: "RUNNING", "SUCCESS", "FAILED"

jobId

The process ID generated in Data Management for the job

logFileName

Name of the log file containing entries for this execution.

outputFileName

Name of the output file generated, if any.

processType

Type of the process executed. Contains "COMM_LOAD_BALANCES" for all data rule executions.

executedBy

Login name of the user who executed the rule.

details

Returns the exception stack trace in case of an application error

Example of Response Body

The following shows an example of the response body in JSON format.

{
    "jobStatus": "RUNNING",
    "jobId": 2019,
    "logFileName": "\\outbox\\logs\\Account Reconciliation Manager_2019.log",
    "outputFileName": null,
    "processType": "COMM_LOAD_BALANCES",
    "executedBy": "admin",
    "status": -1,
    "links": [
        {
            "rel": "self",
            "href": "https://<SERVICE_NAME>-<TENANT_NAME>.<SERVICE_TYPE>.<dcX>.oraclecloud.com/aif/rest/V1/jobs/2019",
            "action": "GET"
        }
    ],
    "details": null
}
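Because the initial response typically reports status -1 (in progress), a client usually polls the job's "self" link until a terminal status is reached. The sketch below shows that loop under stated assumptions: `fetch_status` stands in for a GET on the self link (e.g. /aif/rest/V1/jobs/2019) that returns the parsed JSON body, and `poll_until_done` is an illustrative helper name, not part of the documented API.

```python
import time


def poll_until_done(fetch_status, interval=5.0, max_polls=120):
    """Call fetch_status() until the job's numeric status is terminal.

    Statuses -1 (in progress) and 2 (cancel pending) mean "keep waiting";
    0 (success), 1 (error), 3 (cancelled), and 4 (invalid parameter)
    are final.
    """
    for _ in range(max_polls):
        body = fetch_status()  # parsed JSON body of a GET on the self link
        if body["status"] not in (-1, 2):
            return body
        time.sleep(interval)
    raise TimeoutError("job did not reach a terminal status in time")
```

Passing the polling function as a callable keeps the retry logic independent of the HTTP client used to issue the GET.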