Running Data Rules
Executes a Data Management data load rule based on the start period and end period, and import or export options that you specify.
Prerequisites

- Data Rules: Data load rules define how Integrations load data from a file. You must have predefined data load rules to load data.
- You must have the required privileges to execute a specific data rule.
REST Resource
POST /aif/rest/{api_version}/jobs
Request
Supported Media Types: application/json
Parameters
The following table summarizes the client request.
Table 9-5 Parameters
Name | Description | Type | Required | Default
---|---|---|---|---
api_version | Version of the API you are working with, such as V1 | Path | Yes | None
jobType | Must be set to "DATARULE" | | Yes | None
jobName | The name of a data load rule defined in Data Management. Enclose the rule name in quotation marks if it contains a space. | | Yes | None
startPeriod | The first period for which data is to be loaded. This period name must be defined in Data Management period mapping. | | Yes | None
endPeriod | The last period for which data is to be loaded. This period name must be defined in Data Management period mapping. | | Yes | None
importMode | Determines how the data is imported into Data Management. Acceptable values are: APPEND, REPLACE, RECALCULATE, and NONE. | | Yes | None
exportMode | Determines how the data is exported from Data Management to the target application. Acceptable values for Oracle Enterprise Planning and Budgeting Cloud and Oracle Planning and Budgeting Cloud are: STORE_DATA, ADD_DATA, SUBTRACT_DATA, REPLACE_DATA, and NONE. Acceptable values for Oracle Financial Consolidation and Close Cloud and Oracle Tax Reporting Cloud are: MERGE, REPLACE, and NONE. | | Yes | None
fileName | An optional file name. If you do not specify a file name, this API imports the data contained in the file name specified in the load data rule. The data file must already reside in the INBOX prior to data rule execution. | | No | None
Example URL
https://<SERVICE_NAME>-<TENANT_NAME>.<SERVICE_TYPE>.<dcX>.oraclecloud.com/aif/rest/V1/jobs
Example of Request Body
{
  "jobType": "DATARULE",
  "jobName": "aso to bso dr",
  "startPeriod": "Dec-18",
  "endPeriod": "Dec-18",
  "importMode": "REPLACE",
  "exportMode": "NONE",
  "fileName": ""
}
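A request like the one above can be submitted with any REST client. The following is a minimal sketch using the JDK's `java.net.http.HttpClient` (JDK 11 or later); the service URL and credentials are placeholders, and `buildPayload` and `submit` are illustrative helper names, not part of the Data Management API:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class RunDataRule {

    // Builds the JSON request body for a DATARULE job from the documented parameters.
    static String buildPayload(String jobName, String startPeriod, String endPeriod,
                               String importMode, String exportMode, String fileName) {
        return "{\"jobType\":\"DATARULE\","
             + "\"jobName\":\"" + jobName + "\","
             + "\"startPeriod\":\"" + startPeriod + "\","
             + "\"endPeriod\":\"" + endPeriod + "\","
             + "\"importMode\":\"" + importMode + "\","
             + "\"exportMode\":\"" + exportMode + "\","
             + "\"fileName\":\"" + fileName + "\"}";
    }

    // POSTs the payload to /aif/rest/V1/jobs using HTTP basic authentication.
    static String submit(String baseUrl, String user, String password, String payload)
            throws Exception {
        String auth = Base64.getEncoder()
                .encodeToString((user + ":" + password).getBytes());
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/aif/rest/V1/jobs"))
                .header("Content-Type", "application/json")
                .header("Authorization", "Basic " + auth)
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        return response.body(); // contains jobId, jobStatus, status, and links
    }
}
```

A call such as `submit(serviceUrl, user, password, buildPayload("aso to bso dr", "Dec-18", "Dec-18", "REPLACE", "NONE", ""))` returns the job response body documented in the Response section.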
Response
Supported Media Types: application/json
Table 9-6 Parameters
Name | Description
---|---
status | Status of the job: -1 = in progress; 0 = success; 1 = error; 2 = cancel pending; 3 = cancelled; 4 = invalid parameter
jobStatus | A text representation of the job status, with one of the following values: "RUNNING", "SUCCESS", "FAILED"
jobId | The process ID generated in Data Management for the job
logFileName | Log file containing entries for this execution
outputFileName | Name of the output file generated, if any
processType | Type of the process executed. Contains "COMM_LOAD_BALANCES" for all data rule executions
executedBy | Login name of the user who executed the rule
details | Returns the exception stack trace in case of an application error
Example of Response Body
The following shows an example of the response body in JSON format.
{
  "jobStatus": "RUNNING",
  "jobId": 2019,
  "logFileName": "\\outbox\\logs\\Account Reconciliation Manager_2019.log",
  "outputFileName": null,
  "processType": "COMM_LOAD_BALANCES",
  "executedBy": "admin",
  "status": -1,
  "links": [
    {
      "rel": "self",
      "href": "https://<SERVICE_NAME>-<TENANT_NAME>.<SERVICE_TYPE>.<dcX>.oraclecloud.com/aif/rest/V1/jobs/2019",
      "action": "GET"
    }
  ],
  "details": null
}
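A `status` of -1 indicates the job is still running; clients typically issue GET requests against the `href` in `links` until the status reaches a terminal value. The following sketch interprets the documented status codes (the class and method names are illustrative, not part of the API):

```java
public class JobStatusCodes {

    // Maps the numeric status codes documented above to readable labels.
    static String describe(int status) {
        switch (status) {
            case -1: return "in progress";
            case 0:  return "success";
            case 1:  return "error";
            case 2:  return "cancel pending";
            case 3:  return "cancelled";
            case 4:  return "invalid parameter";
            default: return "unknown";
        }
    }

    // A terminal status means polling can stop; -1 (in progress) and
    // 2 (cancel pending) indicate the job is still being processed.
    static boolean isTerminal(int status) {
        return status != -1 && status != 2;
    }
}
```

For example, the response body above carries `"status": -1`, so a polling client would wait and query the `self` link again before reading `logFileName` or `outputFileName`.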
Running Data Rules Sample Code
Java Sample prerequisites: java-json.jar, jersey-core.jar, jersey-client.jar, jersey-json.jar
Common functions: See Common Helper Functions for Java
Java Sample – RulesResource.java
/*
File: RulesResource.java - Created on Apr 24, 2015
Copyright (c) 2015 Oracle Corporation. All Rights Reserved.
This software is the confidential and proprietary information of Oracle.
*/
@POST
@Path("/jobs")
@Produces(MediaType.APPLICATION_JSON)
@Consumes(MediaType.APPLICATION_JSON)
public Object executeJob(String jsonPayload, @HeaderParam("ssoToken") String ssoToken, @HeaderParam("userName") String userName){
Response.Status httpStatus = Response.Status.OK;
BaseDTO response = null;
RulesRequestDO requestDo = null;
try{
ObjectMapper mapper = new ObjectMapper();
requestDo = mapper.readValue(jsonPayload, RulesRequestDO.class);
}catch (Throwable t){
ERPIModelLogger.logError("Error encountered in Parsing JSON payload:",null,t);
httpStatus = Response.Status.BAD_REQUEST;
response = RestResourceUtil.getCustomErrorResponse("EPMFDM-ERROR: Error in parsing JSON Payload",response, StatusCode.INVALID_INPUT);
}
if (requestDo != null && RestResourceUtil.isBlankString(requestDo.getJobType())){
httpStatus = Response.Status.BAD_REQUEST;
response = RestResourceUtil.getCustomErrorResponse("EPMFDM-ERROR: No Job Type Specified", response, StatusCode.INVALID_INPUT);
}else if (requestDo != null &&
!(requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.DATARULE) || requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.BATCH)
|| requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.DIRECTIMPORT) || requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.DIRECTEXPORT)
|| requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.EXPORTTOARM) || requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.REFRESH_APP_DIMENSIONS)
|| requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.REFRESH_APP_METADATA) || requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.REPORT)
|| requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.MAPPINGIMPORT) || requestDo.getJobType().equalsIgnoreCase(RestResourceConstants.MAPPINGEXPORT))){
httpStatus = Response.Status.BAD_REQUEST;
response = RestResourceUtil.getCustomErrorResponse("EPMFDM-ERROR: Invalid Job Type Specified", response, StatusCode.INVALID_PARAMETER);
}
if (requestDo == null){
requestDo = new RulesRequestDO();
}
ERPIModelLogger.logInfo("Executing Job with options: "+requestDo.toString(), null);
if ( RestResourceConstants.DATARULE.equalsIgnoreCase(requestDo.getJobType()) ){
try{
response = validateDataRuleParams(requestDo);
if (response.getStatus() == 0){
if (RestResourceConstants.NONE.equalsIgnoreCase(requestDo.getImportMode())){
requestDo.setImportMode(null);
}
if (RestResourceConstants.NONE.equalsIgnoreCase(requestDo.getExportMode())){
requestDo.setExportMode(null);
}
response = getRulesServiceV1().executeDataRule(ssoToken, requestDo.getJobName(), requestDo.getStartPeriod(),
requestDo.getEndPeriod(), requestDo.getImportMode(), requestDo.getExportMode(),
requestDo.getFileName(),getBaseURL());
}else{
httpStatus = Response.Status.BAD_REQUEST;
}
}catch (Throwable t){
ERPIModelLogger.logError("Error encountered in Execute Rule Service:",null,t);
httpStatus = Response.Status.INTERNAL_SERVER_ERROR;
response = RestResourceUtil.getHttpErrorResponse(t,response);
}
}else if ( RestResourceConstants.BATCH.equalsIgnoreCase(requestDo.getJobType()) ){
try{
if (RestResourceUtil.isBlankString(requestDo.getJobName())){
httpStatus = Response.Status.BAD_REQUEST;
response = RestResourceUtil.getCustomErrorResponse("Invalid or missing parameter: Job Name", response, StatusCode.INVALID_PARAMETER);
}else{
response = getRulesServiceV1().executeBatch(requestDo.getJobName(), ssoToken, getBaseURL());
}
}
catch (Throwable t){
ERPIModelLogger.logError("Error encountered in Fetch Batch Definition Service:",null,t);
httpStatus = Response.Status.INTERNAL_SERVER_ERROR;
response = RestResourceUtil.getHttpErrorResponse(t,response);
}
}
return Response.status(httpStatus.getStatusCode()).header(HttpHeaders.CONTENT_TYPE, MediaType.APPLICATION_JSON + "; charset=UTF-8").entity(response).build();
}
Java Sample - Run Data Rule Validator, RulesResource.java
private BaseDTO validateDataRuleParams(RulesRequestDO requestDo){
BaseDTO response = new BaseDTO();
boolean isError = false;
String errorMessage = "Invalid or missing parameter: ";
if (RestResourceUtil.isBlankString(requestDo.getJobName())){
errorMessage = errorMessage + "Job Name,";
isError = true;
}
if (RestResourceUtil.isBlankString(requestDo.getStartPeriod())){
errorMessage = errorMessage + "Start Period,";
isError = true;
}
if (RestResourceUtil.isBlankString(requestDo.getEndPeriod())){
errorMessage = errorMessage + "End Period,";
isError = true;
}
if ((RestResourceUtil.isBlankString(requestDo.getImportMode()) || RestResourceConstants.NONE.equalsIgnoreCase(requestDo.getImportMode()) ) &&
(RestResourceUtil.isBlankString(requestDo.getExportMode()) || RestResourceConstants.NONE.equalsIgnoreCase(requestDo.getExportMode())) ){
errorMessage = errorMessage + "Import/Export mode,";
isError = true;
}else {
if (!RestResourceUtil.isBlankString(requestDo.getImportMode()) &&
!("APPEND".equalsIgnoreCase(requestDo.getImportMode()) || "REPLACE".equalsIgnoreCase(requestDo.getImportMode()) ||
"RECALCULATE".equalsIgnoreCase(requestDo.getImportMode()) || RestResourceConstants.NONE.equalsIgnoreCase(requestDo.getImportMode()) )){
errorMessage = errorMessage + "Import mode,";
isError = true;
}
if (!RestResourceUtil.isBlankString(requestDo.getExportMode()) &&
!("STORE_DATA".equalsIgnoreCase(requestDo.getExportMode()) || "ADD_DATA".equalsIgnoreCase(requestDo.getExportMode())
|| "SUBTRACT_DATA".equalsIgnoreCase(requestDo.getExportMode()) || "REPLACE_DATA".equalsIgnoreCase(requestDo.getExportMode())
|| "REPLACE".equalsIgnoreCase(requestDo.getExportMode()) || "MERGE".equalsIgnoreCase(requestDo.getExportMode())
|| RestResourceConstants.NONE.equalsIgnoreCase(requestDo.getExportMode()) )){
errorMessage = errorMessage + "Export mode,";
isError = true;
}
}
if (isError){
errorMessage = errorMessage.substring(0, errorMessage.length()-1);
response = RestResourceUtil.getCustomErrorResponse(errorMessage, response, StatusCode.INVALID_PARAMETER);
}
return response;
}
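The same checks can be mirrored on the client before a request is sent, avoiding a round trip that would end in a 400 response. The sketch below reproduces the acceptable import and export mode values enforced by the validator above; the class and method names are illustrative:

```java
import java.util.Arrays;
import java.util.List;

public class DataRuleParamCheck {

    static final List<String> IMPORT_MODES =
            Arrays.asList("APPEND", "REPLACE", "RECALCULATE", "NONE");
    static final List<String> EXPORT_MODES =
            Arrays.asList("STORE_DATA", "ADD_DATA", "SUBTRACT_DATA",
                          "REPLACE_DATA", "REPLACE", "MERGE", "NONE");

    static boolean isBlank(String s) {
        return s == null || s.trim().isEmpty();
    }

    // Returns null when the parameters are valid, otherwise an error message
    // listing the offending parameters, mirroring the server-side checks.
    static String validate(String jobName, String startPeriod, String endPeriod,
                           String importMode, String exportMode) {
        StringBuilder errors = new StringBuilder();
        if (isBlank(jobName))     errors.append("Job Name,");
        if (isBlank(startPeriod)) errors.append("Start Period,");
        if (isBlank(endPeriod))   errors.append("End Period,");
        boolean noImport = isBlank(importMode) || "NONE".equalsIgnoreCase(importMode);
        boolean noExport = isBlank(exportMode) || "NONE".equalsIgnoreCase(exportMode);
        if (noImport && noExport) {
            // At least one of the two modes must perform work.
            errors.append("Import/Export mode,");
        } else {
            if (!isBlank(importMode) && !IMPORT_MODES.contains(importMode.toUpperCase()))
                errors.append("Import mode,");
            if (!isBlank(exportMode) && !EXPORT_MODES.contains(exportMode.toUpperCase()))
                errors.append("Export mode,");
        }
        if (errors.length() == 0) return null;
        // Drop the trailing comma, as the server-side validator does.
        return "Invalid or missing parameter: " + errors.substring(0, errors.length() - 1);
    }
}
```

For instance, the example request body earlier in this topic (`importMode` of REPLACE, `exportMode` of NONE) passes this check, while a request with both modes set to NONE is rejected before it leaves the client.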