DATAEXPORT

The DATAEXPORT calculation command writes data from an Essbase cube to a text or binary file.

Syntax

For a text output file:

DATAEXPORT "File" "delimiter" "fileName" "missingChar"

For a binary output file (binary export files are not portable across Essbase releases, and are supported only between 64-bit operating systems):

DATAEXPORT "Binfile" "fileName"

Parameters

"File" | "Binfile"

Required keyword for the type of output file. Specify the appropriate keyword, then use the associated syntax.

"delimiter"

Required for "File" exports.

The character that separates fields; for example, a comma (,).

Do not use with "Binfile" exports.

"fileName"

Required for "File" and "Binfile" exports

File name for the export file.

"missingChar"

Optional for output type "File"

  • A text string to represent missing data values. Maximum length: 128 characters.

  • "NULL" to skip the field, resulting in consecutive delimiters (such as ,,).

  • Default value: #MI

Do not use with "Binfile" exports.
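To make the "missingChar" behavior concrete, the following sketch exports the same slice twice, once with the default #MI marker and once with "NULL". The member names and output file names here are illustrative assumptions, not part of the syntax:

SET DATAEXPORTOPTIONS
  {
  DataExportLevel "LEVEL0";
  };
FIX ("100-10","New York","Actual");
DATAEXPORT "File" "," "with_mi.txt" "#MI";
DATAEXPORT "File" "," "with_null.txt" "NULL";
ENDFIX;

In with_mi.txt, a missing value appears as the literal text #MI between delimiters; in with_null.txt, the field is skipped, producing consecutive commas (,,).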

Notes

  • In general, use SET commands within the calculation script to specify export options, and then use FIX…ENDFIX to refine the data to be exported, placing the DATAEXPORT command within the FIX…ENDFIX block. Without FIX…ENDFIX, the entire cube is exported.

  • Exported data from the cube is written to the <Application Directory>/app/appname/dbname directory, unless the administrator has specified a different export location using FILEGOVPATH. If you do not know where <Application Directory> is in your environment, see Environment Locations in the Essbase Platform for an explanation.

    Export files cannot be written to a client.

    If the specified export file already exists, the export fails; this is a safeguard against overwriting existing export files. (The CSV examples below set the DataExportOverwriteFile ON option to allow an existing file to be replaced.)

  • To use this command with parallel calculation, use FIXPARALLEL...ENDFIXPARALLEL instead of SET CALCPARALLEL.

  • Use the DATAIMPORTBIN command to import a previously exported binary export file.

  • Calculation export locks one block at a time; all other blocks can be updated.
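The parallel-calculation note above can be sketched as follows. This is a minimal illustration, not a complete script: the thread count (2) and the task member list (@CHILDREN("Market")) are placeholder assumptions for a Sample.Basic-style outline.

SET DATAEXPORTOPTIONS
  {
  DataExportLevel "LEVEL0";
  };
FIXPARALLEL (2, @CHILDREN("Market"))
  DATAEXPORT "File" "," "export.txt" "#MI";
ENDFIXPARALLEL;

FIXPARALLEL divides the member list into tasks that can be processed on the specified number of threads, rather than relying on SET CALCPARALLEL.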

Description

The DATAEXPORT calculation command writes data into a text or binary output file.

Whereas the MaxL Export Data statement can export all, level 0, or input data from the entire cube as text data, the DATAEXPORT calculation command also enables you to refine the export to a slice of the cube using FIX…ENDFIX, filter the exported values using DATAEXPORTCOND, and control the output format and content using SET DATAEXPORTOPTIONS, as the following examples show.

Example

Text Output File: Level 0 Slice

SET DATAEXPORTOPTIONS
  {
  DataExportLevel "LEVEL0";
  };
DATAEXPORTCOND ("Sales">=1000);
FIX ("100-10","New York","Actual","Sales");
DATAEXPORT "File" "," "jan.txt" "#MI";
ENDFIX;

This script specifies a level 0 data export level, limits output to data with Sales values of 1000 or greater, fixes the data slice, and then exports to a comma-delimited text file in the cube directory, using #MI for missing data values.

Text Output File: CSV Format for Federated Partition

The following examples export data in the required format for the DBMS_CLOUD package (applicable when you use a federated partition to integrate your cube with Autonomous Data Warehouse).

Example 1

By specifying the pivot dimension of the fact table as the DataExportColHeader, you can generate a data file that is in the format required for DBMS_CLOUD. In this calc script example, the pivot dimension is Measures.

SET DATAEXPORTOPTIONS
  {
  DataExportColHeader "Measures";
  DataExportOverwriteFile ON;
  DATAEXPORTDIMHEADER ON;
  DATAEXPORTCSVFORMAT ON;
  };
  DATAEXPORT "File" "," "out.txt" "NULL";

The example above generates a data file that is in the format required for DBMS_CLOUD, as shown in the following output example. The fact table column names are in the header, and the fields of each data record are comma separated.

"Year","Product","Market","Scenario","Sales","COGS","Margin","Marketing","Payroll","Misc","Total Expenses","Profit","Opening Inventory","Additions","Ending Inventory"
"Jan","100-10","New York","Actual",678,271,407,94,51,0,145,262,2101,644,2067
"Feb","100-10","New York","Actual",645,258,387,90,51,1,142,245,2067,619,2041
"Mar","100-10","New York","Actual",675,270,405,94,51,1,146,259,2041,742,2108

Example 2

In this calc script example, the pivot dimension is Year.

SET DATAEXPORTOPTIONS
  {
  DataExportColHeader "Year";
  DataExportOverwriteFile ON;
  DATAEXPORTDIMHEADER ON;
  DATAEXPORTCSVFORMAT ON;
  };
  DATAEXPORT "File" "," "out.txt" "NULL";

The example above generates a data file that is in the format required for DBMS_CLOUD, as shown in the following output example. The fact table column names are in the header, and the fields of each data record are comma separated.

"Product","Market","Scenario","Measures","Jan","Feb","Mar","Apr","May","Jun","Jul","Aug","Sep","Oct","Nov","Dec"
"100-10","Utah","Actual","Additions",1762.0,1681.0,1482.0,1201.0,1193.0,1779.0,2055.0,1438.0,1991.0,1443.0,1379.0,1415.0
"100-10","Utah","Actual","COGS",598.0,714.0,630.0,510.0,510.0,756.0,697.0,533.0,624.0,535.0,536.0,600.0
"100-10","Utah","Actual","Ending Inventory",5417.0,8379.0,10294.0,4125.0,3823.0,7861.0,7283.0,4636.0,6383.0,9805.0,9247.0,8757.0
"100-10","Utah","Actual","Margin",383.0,449.0,363.0,389.0,431.0,445.0,458.0,394.0,401.0,350.0,353.0,437.0
"100-10","Utah","Actual","Marketing",201.0,240.0,210.0,171.0,171.0,254.0,235.0,178.0,209.0,180.0,180.0,202.0
"100-10","Utah","Actual","Misc",2.0,1.0,3.0,1.0,1.0,0.0,0.0,3.0,2.0,2.0,1.0,0.0
"100-10","Utah","Actual","Opening Inventory",4636.0,7861.0,9805.0,3823.0,3571.0,7283.0,6383.0,4125.0,5417.0,9247.0,8757.0,8379.0
"100-10","Utah","Actual","Payroll",121.0,116.0,116.0,116.0,116.0,116.0,121.0,116.0,121.0,116.0,116.0,116.0
"100-10","Utah","Actual","Profit",59.0,92.0,34.0,101.0,143.0,75.0,102.0,97.0,69.0,52.0,56.0,119.0

Binary Example 1: Export

SET DATAEXPORTOPTIONS 
  { 
  DataExportLevel "ALL";
  };
FIX ("New York");
DATAEXPORT "Binfile" "newyork.bin";
ENDFIX;

Exports all New York blocks. Binary exports can be fixed only on sparse dimensions. Essbase creates the file using the same bitmap compression technique used by the Essbase Kernel.

Binary Example 2: Import

SET DATAIMPORTIGNORETIMESTAMP OFF;
DATAIMPORTBIN "newyork.bin";

Imports the previously exported file; because SET DATAIMPORTIGNORETIMESTAMP is OFF, the file's timestamp must match. The data is imported to the cube on which the calculation script is executed. Because only data was exported, after using DATAIMPORTBIN to read the data back in, you must recalculate the data to recreate the cube.