10 Import Data Using External Data Loader Client

This chapter contains the following:

Overview of External Data Loader Client

This topic provides an overview of the External Data Loader Client.

The External Data Loader Client is a command-line tool that breaks a large data file into smaller importable data files. This improves the performance of large-volume imports. You can also queue, in a single action, the multiple file import activities required to import a large data set, instead of manually or programmatically triggering multiple import activities.

How You Download and Install the External Data Loader Client

This topic shows how to download and install the External Data Loader Client. Here are the steps involved in this process:

  1. Download and Install the Client

  2. Set up Credentials

Before You Start

Ensure that your computer has Java version 1.8 or higher installed.
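For example, you can check the installed Java version from the command line:

> java -version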

Ensure that your Operating System is one of these:

  • Microsoft Windows 7

  • Microsoft Windows 8

  • Microsoft Windows 10

Required Privileges

To download the client, you must have access to My Oracle Support (support.oracle.com). To use the tool to import data into CX Sales and B2B Service, you must be able to access REST services and have the privileges to import data for the relevant objects.

Prerequisite Steps

The Data Loader client doesn't require any specific configuration within CX Sales and B2B Service. However, you must create an import map before using it. The client passes a mapping number value when submitting an import request. You can't create a new import mapping from the Data Loader Client application.

Download and Install the Client

You can download the client from My Oracle Support (Doc ID 2325249.1).

The download is a ZIP file that contains sub-folders. Save the ZIP file to a directory whose full path contains no white space, and extract all of the files to a folder on your computer.

Note: When executing the client from the command line, the application uses the name oscdataloader.jar. For example:
> java -jar oscdataloader.jar -h

Set up Credentials

You can install and use the client for automated imports while keeping the credentials hidden with the help of the credential store. The credential wallet holds one set of user credentials (a user name and password pair). To save your credentials, use the savepassword command:

> java -jar oscdataloader.jar savepassword -username John.Doe

Enter your password:

>

You are prompted to enter the password for the specified user name value. The password is masked so that it can't be viewed in the command window. You can then use the tool to submit an import, or to retrieve its status or log files, by providing only the user name value. This protects the password of the integration user by preventing other users from viewing it on the command line (by scrolling back through recently executed commands), in the scheduling utility, or in a configuration file containing the password value.
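For example, after the password has been saved with the savepassword command, an import can be submitted by providing only the user name (the pod URL, file name, and mapping number below are placeholder values):

> java -jar oscdataloader.jar submit -target-url https://mypod.oracle.com -username John.Doe -input-file Account.csv -import-name "Account Import" -object Account -mapping-number 123565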

Use the Configuration File

The default configuration file is oscdataloader.config, located in the config folder. This file is used for all imports unless you specify a different configuration file using the -config parameter. For example:

> java -jar oscdataloader.jar listimports -config c:\test\configfile.txt

The configuration file should contain all the required parameters, with each line of the file containing a key-value pair. For example:

import-name=Testing

input-file=c:\\testing\\testing.csv

mapping-number=888100111272708

target-url=https://mypod.oracle.com

object=Account

username=john.doe
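Assuming the submit parameters are stored in the configuration file as shown above, the import could then be submitted by passing only the file path with the -config parameter (a sketch; the file path is a placeholder):

> java -jar oscdataloader.jar submit -config c:\test\configfile.txt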

How You Use the External Data Loader Client

This topic describes how to use the External Data Loader Client tool.

You can use the tool to split a source file into multiple smaller files for upload and then import. This lets you import large files that exceed the import size limits. You can split a CSV file in one of two ways:

  • Independently - the specified CSV file is split based on the selected parameters and no import is performed (see the example after this list)

  • As part of an import - the specified CSV file is split, and each of the resulting files is then uploaded and imported
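For example, a minimal sketch of an independent split, assuming the split command accepts the -input-file and -split-records options described in the Command-Line Parameters topic:

> java -jar oscdataloader.jar split -input-file Account.csv -split-records 10000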

The following steps are involved in using the tool:

  • Submit Import Jobs

  • Review Status of Import Jobs

  • Retrieve Log Files

Submit Import Jobs

The submit operation takes the single source CSV file, splits it into smaller files (if necessary), uploads those files to the WebCenter Content repository, and then imports them. When executing the client from the command line, you must provide the URL of the pod to run the import, the user credentials for authentication, the import object name, the import map number, and the import source file name.

> java -jar oscdataloader.jar submit -target-url https://mypod.oracle.com -username importuser -input-file Account.csv -import-name "Account Import" -object Account -mapping-number 123565

If multiple import activities are created due to file splitting, the import activity names are the same, but the file names are suffixed with a count value. For example, if the source file Account.csv is split into two files, the resulting files are named Account_1.csv and Account_2.csv.

You can retrieve the list of imports that have recently run on a server using the listimports command. For example,

> java -jar oscdataloader.jar listimports -target-url https://mypod.oracle.com -username importuser

lists the imports as a table in the command window. An example of the output contains the following information:

Identifier         Name                     Object       Attachment Object  Start Time            End Time              Status                 Submitted By  Activate
0123456789012345   Account Import 20170410  Account      N/A                20170410 14:56T05:00  20170410 16:56T05:00  Completed              John.Doe      Yes
7878787878787878   Opportunity Sync         Opportunity  N/A                20170411 17:06T05:00  20170411 20:06T05:00  Completed with errors  John.Doe      Yes

Note: You should place the import source files in local folders while using EDLC. Remotely mounted files might cause performance issues.

Review Status of Import Jobs

To get the status of an import activity, enter the identifier value of the import on the command line. For example:

> java -jar oscdataloader.jar status -target-url https://mypod.oracle.com -username importuser -import-activity 012345678912345

This command displays the detailed status of the matching import request (if no matching request is found, then an error is displayed). An example of the output contains the following information:

Object   Attachment Object  Status     Start Time            End Time              Number of File Splits  Records Read  Successfully Loaded  Errors Reported  Warnings Reported
Account  N/A                Completed  20170410 14:56T05:00  20170410 17:36T05:00  2                      50,000        49,950               50               0

Retrieve Log Files

You can get the log files generated by an import activity using the getlogs command. For example,

> java -jar oscdataloader.jar getlogs -target-url https://mypod.oracle.com -username importuser -log-type Exception -import-activity 012345678912345

Log files retrieved by the getlogs command are stored in the importlogs folder by default. The External Data Loader Client's own log files are stored in the oscdataloaderlogs folder.

Import Job Statuses

The import job can have one of these statuses:

  • Completed - all batches have completed.

  • Completed with Errors - one or more batches have completed with errors.

  • In Progress - one or more batches are currently in progress.

  • Unsuccessful - one or more batches are unsuccessful.

  • Completed with Warnings - one or more batches have completed with warnings.

Command-Line Parameters

This topic describes the available command-line parameters for External Data Loader Client.

You can use the help option to see a list of all the commands that are available. For example, the following command lists all the command-line options.

> java -jar oscdataloader.jar help

You can get the options for each command, by using the option -h for that command. For example, the following command lists the options available for the submit command.

> java -jar oscdataloader.jar submit -h

If you're using a proxy server while importing with the EDLC, then you must specify the proxy parameters in the command-line call. The following example shows how:

> java -Dhttps.proxyHost=host_name -Dhttps.proxyPort=80 -jar oscdataloader.jar listobjects -target-url https://<your_pod> -u mhoope -w

Here are the command-line commands and options (options are preceded by a hyphen), each followed by its description.

-config

The path where the configuration file is located.

-desc

The description of the import job submitted.

-delimiter

The delimiter used in your source file. The default value is COMMA_DELIMITER. Possible values are:

  • COMMA_DELIMITER: Indicates the comma delimiter

  • OTHER_CARET: Indicates the caret symbol delimiter

  • OTHER_COLON: Indicates the colon delimiter

  • OTHER_CLOSING_CURLY_BRACKET: Indicates the closing curly bracket delimiter

  • OTHER_CLOSING_PARENTHESIS: Indicates the closing parenthesis delimiter

  • OTHER_CLOSING_SQUARE_BRACKET: Indicates the closing square bracket delimiter

  • OTHER_EXCLAMATORY: Indicates the exclamation mark delimiter

  • OTHER_MINUS: Indicates the minus delimiter

  • OTHER_OPENING_CURLY_BRACKET: Indicates the opening curly bracket delimiter

  • OTHER_OPENING_PARENTHESIS: Indicates the opening parenthesis delimiter

  • OTHER_OPENING_SQUARE_BRACKET: Indicates the opening square bracket delimiter

  • OTHER_PIPE: Indicates the pipe delimiter

  • OTHER_PLUS: Indicates the plus delimiter

  • OTHER_QUESTION_MARK: Indicates the question mark delimiter

  • OTHER_STAR: Indicates the star symbol delimiter

  • OTHER_TILDE: Indicates the tilde symbol delimiter

  • SEMI_COLON_DELIMITER: Indicates the semicolon delimiter
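Example command (a sketch; the pod URL, user name, file, and mapping number are placeholder values) for importing a pipe-delimited source file:

> java -jar oscdataloader.jar submit -target-url https://mypod.oracle.com -username importuser -input-file Account.csv -import-name "Account Import" -object Account -mapping-number 123565 -delimiter OTHER_PIPE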

-df

The date format that's used in your source CSV file. The default value is ORA_ZCA_US_FORMAT. Possible values are:

  • ORA_ZCA_EUROPE_FORMAT: EU - YYYY/MM/DD

  • ORA_ZCA_ISO8601_FORMAT: ISO 8601 - YYYY-MM-DD

  • ORA_ZCA_UK_CANADA_FORMAT: UK/Canada - DD/MM/YYYY

  • ORA_ZCA_US_FORMAT: US - MM/DD/YYYY
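Example command (a sketch with placeholder values) for a source file that uses ISO 8601 dates:

> java -jar oscdataloader.jar submit -target-url https://mypod.oracle.com -username importuser -input-file Account.csv -import-name "Account Import" -object Account -mapping-number 123565 -df ORA_ZCA_ISO8601_FORMAT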

-ds

The decimal separator used in your source file. The default value is Period (.). Possible values are:

  • , (Comma)

  • . (Period)

-ecl

Enable or disable custom business logic for the import job. The default value is false. Possible values are:

  • true

  • false

-encoding

The source file encoding. Possible values are:

  • ORA_ZCA_UTF8_ENCODING

  • ORA_ZCA_UTF16_ENCODING

  • ORA_ZCA_UTF16BE_ENCODING

  • ORA_ZCA_UTF16LE_ENCODING

  • UTF8
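Example command (a sketch with placeholder values) for a UTF-16 encoded source file:

> java -jar oscdataloader.jar submit -target-url https://mypod.oracle.com -username importuser -input-file Account.csv -import-name "Account Import" -object Account -mapping-number 123565 -encoding ORA_ZCA_UTF16_ENCODING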

getlogs

Get log files of an import activity.

-help

Display help text containing possible commands.

help

Show the usage message.

-high-volume

Activate High-Volume Import for supported objects.

-import-activity (required)

The import activity identifier of the job. This is used to track the status of the import job.

-import-name

The name of the import job.

-input-file (required)

The name of the source CSV file. The file name must not exceed 40 characters.

-limit

The import limit.

listimports

Get the list of import activities.

listobjects

List the objects supported by External Data Loader Client.
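Example command (the pod URL and user name are placeholder values):

> java -jar oscdataloader.jar listobjects -target-url https://mypod.oracle.com -username importuser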

-log-type (required)

Specify the type of log file to generate. Possible values are Exception, Error, Diagnostic, and All.

-mapping-number (required)

The mapping number.

-mode

The import mode to be used for the import job. The default mode is CREATE_RECORD. Possible values are:

  • CREATE_ONLY: Create records

  • CREATE_RECORD: Update and create records

  • DELETE_RECORD: Delete records

  • UPDATE_RECORD: Update records
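Example command (a sketch with placeholder values) that runs the import in update-only mode:

> java -jar oscdataloader.jar submit -target-url https://mypod.oracle.com -username importuser -input-file Account.csv -import-name "Account Import" -object Account -mapping-number 123565 -mode UPDATE_RECORD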

-nemail

The email address to which the import job status is sent.

-object (required)

The object to import.

-offset

The import offset.

-s

The scheduling mode for the import job, which determines whether the import job runs immediately or at a future date. The default value is ONE_TIME_IMMEDIATE. Possible values are:

  • ONE_TIME_IMMEDIATE: Immediate

  • ONE_TIME_SCHEDULE: Future

savepassword

Save credentials in the credential store wallet.

split

Split the import data file into multiple smaller files.

-split-records

The split size (number of records per file) used by EDLC. You can use either a low-volume split size or a high-volume split size.

Low-volume split has a default split size of 50000 records. The possible values are:

  • 10000

  • 20000

  • 30000

  • 40000

  • 50000

Example command:

> java -jar oscdataloader.jar submit -input-file Account.csv -import-name "Account Import" -object Account -mapping-number 123565 -split-records 30000

High-volume split has a default split size of 500000 records. The possible values are:

  • 100000

  • 200000

  • 300000

  • 400000

  • 500000

Example command:

> java -jar oscdataloader.jar submit -input-file Account.csv -import-name "Account Import" -object Account -mapping-number 123565 -high-volume -split-records 200000

Note: You can use this parameter when the default values slow down the performance, or if an optimum size is recommended for the data pattern being imported. This parameter is provided to reduce the number of records that need to be split for each EDLC job.

-st

Scheduled future time stamp for an import activity job.

status

Retrieve the status of the specified import.

submit

Submit a file import request to CX Sales and B2B Service.

-target-url (required)

The environment against which the command is run.

-tsf

The time stamp format used in your source file. The default value is ORA_ZCA_US_FORMAT. Possible values are:

  • ORA_ZCA_CANADA_FORMAT: Canada - DD/MM/YYYY hh:mm:ss PM

  • ORA_ZCA_EUROPE_FORMAT: EU - YYYY/MM/DD 24:mm:ss

  • ORA_ZCA_ISO8601_FORMAT: ISO 8601 - YYYY-MM-DDThh:mm:ss.sss+/-hh:mm

  • ORA_ZCA_UK_FORMAT: UK - DD/MM/YYYY 24:mm:ss

  • ORA_ZCA_US_FORMAT: US - MM/DD/YYYY hh:mm:ss PM

-username (required)

Specify the user name of the user running the command.

-wallet

The Credential Store that holds one set of user credentials (user name and password pair).