Upload a Data File to an Environment and Run a Data Load Rule

Use these scripts to upload a file to an environment and then run a data rule to import data from the file into an application.

Prerequisites

  • The following definitions in Data Management:
    • A data load rule definition named VisionActual. It is assumed that the data rule does not specify a file path for the input file.
    • Period definitions Mar-15 through Jun-15
  • A properly formatted data file (GLActual.dat) that contains data.

To import data and run the data load rule, the scripts run commands that complete these steps (an illustrative command sequence using the sample values follows this list):

  • Sign in to the environment.
  • Upload GLActual.dat, a file that contains data for periods Mar-15 through Jun-15, to the Data Management folder inbox/Vision.
  • Import data from GLActual.dat into Data Management using data load rule VisionActual, start period Mar-15, end period Jun-15, and import mode REPLACE.
  • Export data with the STORE_DATA option to merge the data in the Data Management staging table with existing application data.
  • Sign out.
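For reference, the following is a minimal sketch of the equivalent EPM Automate commands run directly from a command prompt, using the sample values from this example. The user name serviceAdmin, the password file password.epw, and the URL https://example.oraclecloud.com are placeholders, not values from your environment.

# Sign in to the environment
epmautomate login serviceAdmin password.epw https://example.oraclecloud.com
# Upload the data file to the Data Management folder inbox/Vision
epmautomate uploadfile GLActual.dat inbox/Vision
# Run data load rule VisionActual for Mar-15 through Jun-15 with import mode REPLACE and export mode STORE_DATA
epmautomate rundatarule VisionActual Mar-15 Jun-15 REPLACE STORE_DATA inbox/Vision/GLActual.dat
# Sign out
epmautomate logout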

Windows Sample Script

Create runDataLoadRule.ps1 by copying the following script. Store it in a local directory.
# Read connection and data load settings from input.properties
$inputproperties = ConvertFrom-StringData(Get-Content ./input.properties -raw)
$username="$($inputproperties.username)"
$passwordfile="$($inputproperties.passwordfile)"
$serviceURL="$($inputproperties.serviceURL)"
$dataFile="$($inputproperties.dataFile)"
$dataFileUploadLocation="$($inputproperties.dataFileUploadLocation)"
$dataRuleName="$($inputproperties.dataRuleName)"
$startPeriod="$($inputproperties.startPeriod)"
$endPeriod="$($inputproperties.endPeriod)"
$importMode="$($inputproperties.importMode)"
$exportMode="$($inputproperties.exportMode)"

# Sign in, upload the data file, run the data load rule, and sign out
epmautomate login ${username} ${passwordfile} ${serviceURL}
epmautomate uploadfile ${dataFile} ${dataFileUploadLocation}
epmautomate rundatarule ${dataRuleName} ${startPeriod} ${endPeriod} ${importMode} ${exportMode} ${dataFileUploadLocation}/${dataFile}
epmautomate logout

Linux/UNIX Sample Script

Create runDataLoadRule.sh by copying the following script. Store it in a local directory.
#!/bin/bash
# Read connection and data load settings from input.properties
. ./input.properties
export JAVA_HOME=${javahome}

# Sign in, upload the data file, run the data load rule, and sign out
${epmautomatescript} login "${username}" "${passwordfile}" "${serviceURL}"
${epmautomatescript} uploadfile "${dataFile}" "${dataFileUploadLocation}"
${epmautomatescript} rundatarule "${dataRuleName}" "${startPeriod}" "${endPeriod}" "${importMode}" "${exportMode}" "${dataFileUploadLocation}/${dataFile}"
${epmautomatescript} logout

Creating the input.properties File

Create the input.properties file by copying one of the following and updating it with information for your environment. Save the file in the directory where runDataLoadRule.ps1 or runDataLoadRule.sh is stored.

Windows
username=serviceAdmin
passwordfile=./password.epw
serviceURL=https://example.oraclecloud.com
dataFile=GLActual.dat
dataFileUploadLocation=UPLOAD_LOCATION
dataRuleName=RULE_NAME
startPeriod=START_PERIOD
endPeriod=END_PERIOD
importMode=IMPORT_MODE
exportMode=EXPORT_MODE
Linux/UNIX
javahome=JAVA_HOME
epmautomatescript=EPM_AUTOMATE_LOCATION
username=exampleAdmin
passwordfile=examplePassword.epw
serviceURL=exampleURL
dataFile=GLActual.dat
dataFileUploadLocation=UPLOAD_LOCATION
dataRuleName=RULE_NAME
startPeriod=START_PERIOD
endPeriod=END_PERIOD
importMode=IMPORT_MODE
exportMode=EXPORT_MODE
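The passwordfile property refers to an encrypted password file. If you do not already have one, you can create it with the EPM Automate encrypt command; the following is a sketch only, and the password, encryption key, and file name shown are placeholders.

# Create an encrypted password file for use with the login command
epmautomate encrypt examplePassword myEncryptionKey password.epw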

Table 3-21 input.properties Parameters

Parameter Description
javahome JAVA_HOME location. For Linux/UNIX only.
epmautomatescript Absolute path of EPM Automate executable (epmautomate.sh). For Linux/UNIX only.
username User name of a Service Administrator, who also has the Identity Domain Administrator role.
passwordfile Name and location of the encrypted password file for the Service Administrator.
serviceURL URL of the environment on which you want to run the data load rule.
dataFile The file that contains the data to be imported using the data rule.
dataFileUploadLocation Location to which the data file is to be uploaded.
dataRuleName Name of a data load rule defined in Data Integration.
startPeriod The first period for which data is to be loaded. This period name must be defined in Data Integration period mapping.
endPeriod For multi-period data load, the last period for which data is to be loaded. For single period load, use the same period as start period. This period name must be defined in Data Integration period mapping.
importMode Mode for importing data into Data Integration. Use APPEND, REPLACE or RECALCULATE. Use NONE to skip data import into staging tables.
exportMode Mode for exporting data from Data Integration to the application. Use STORE_DATA, ADD_DATA, SUBTRACT_DATA, or REPLACE_DATA. Use NONE to skip data export from Data Integration to the application.

Note:

Financial Consolidation and Close supports only MERGE and NONE modes.

Running the Script

  1. Create runDataLoadRule.ps1 or runDataLoadRule.sh by copying the script from a preceding section.
  2. Create the input.properties file and save it in the directory where the runDataLoadRule script is located. The contents of this file differ depending on your operating system. See Creating the input.properties File.

    Make sure that you have write privileges in this directory. For Windows, you may need to start PowerShell using the Run as Administrator option to be able to run the script.

  3. Launch the script (an example Linux/UNIX launch sequence follows these steps).
    • Windows PowerShell: run runDataLoadRule.ps1.
    • Linux/UNIX: run ./runDataLoadRule.sh.
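For example, on Linux/UNIX, a typical launch from the directory that contains runDataLoadRule.sh and input.properties might look like the following sketch.

# Make the script executable the first time, then run it
chmod +x ./runDataLoadRule.sh
./runDataLoadRule.sh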