Introduction

Automate your HCM Data Loader (HDL) and HCM Spreadsheet Data Loader (HSDL) integrations using the dataLoadDataSets REST API. This resource is available from release 24B.

Objectives

In this tutorial, you will:

  • Understand how to upload files to the Oracle WebCenter Content server.
  • Initiate HCM Data Loader and HCM Spreadsheet Data Loader.
  • Monitor Data Set status and retrieve messages.

Prerequisites

To complete the steps in this tutorial, you will need:

  • Access to the dataLoadDataSets resource.

    Grant the role hierarchy Use REST Service - Data Load Data Sets.

  • Access to the hcm$/dataloader$/import$ account on the Oracle WebCenter Content server.

    Grant the Upload data for Human Capital Management file based import role hierarchy.

  • Postman. The examples in this tutorial use Postman variables to reference the environment and other values that change.

    Variable Description
    {{env}} The environment path.
    {{user}} The username.

  • The ability to encode files with Base64 encoding.

Note:

Other variables are displayed in blue within single braces. For example, {file_name}.
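If you don't have a Base64 tool to hand, the encoding prerequisite can be met with a few lines of script. A minimal Python sketch; the sample bytes are illustrative:

```python
import base64

# Base64-encode a file's bytes for the "content" parameter of the
# uploadFile action, returning an ASCII string ready for a JSON body.
def encode_file(path: str) -> str:
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("ascii")

# Round trip on in-memory data to illustrate the encoding:
data = b"GradeId|GradeCode|GradeName"
encoded = base64.b64encode(data).decode("ascii")
assert base64.b64decode(encoded) == data
```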

Task 1: Upload Your Data File to the Oracle WebCenter Content Server

There are multiple ways of uploading files to the Oracle WebCenter Content server. This task covers two REST options: the uploadFile custom action on the HCM Data Loader dataLoadDataSets resource (available from release 24B), and the Oracle WebCenter REST resource.

In this step you'll upload zip and csv files to the Oracle WebCenter Content server.

If you want to complete the steps that follow, you can download and use these files:


HCM Data Loader uploadFile Action

  1. Open Postman and create a new POST request for this URL:
  2. https://{{env}}/hcmRestApi/resources/11.13.18.05/dataLoadDataSets/action/uploadFile
  3. Specify the following JSON Body:
  4. {
    "content" : "UEsDBBQAAAAIAJF+Qle0P8ZqHgEAAAQFAAAJAAAAR3JhZGUuZGF0ldPNaoQwEAfw+4LvsG+wmhhKDz2ERGyg2S269LqkGovQKqRxT3n4xu6Hxj3IQECZTH4h/BmZHSmnR+pyo2rtsqbRlW3PurTKWK7srJR19Vi4dLLed5fa+u+1slc/2tHbYTv8urIfTKUL3cRx8vLfsx3bo43Mijy73pg8k2QXj8ulTwna+YUT98rfTnnBT9KX2UHKw95J1akvbbaJo/NtmIaWGgo0BNPwUsOBhmFautTSQEthGllqJNBItIF4gt1zEF3dntt6UN8+zc6a9nOw/SWWqRsUi2BoBUcBDkpJMLyC4wAHhSZYuoKnAQ7KUDCygpMAB0bKp9HirdHVMkQOmy0eowcOzTnYcPEYB9w9qGkf9tyP95vn/+isHG3+AFBLAwQUAAAACAAbSlJXX/ch5FMCAAC9CgAABwAAAEpvYi5kYXTNldFumzAYhe8r9R14g5Fut7uwwElYY0LBq7SryCNOZI1AZUy1SH74GQdDnJCY9mpSpIjkP+ezzzGAIAYhwED+qH5LuNvRXLB3mgnCRUgElRkVQbWl7d/6OyYHfTEnB1Yc9U/ASERTy3lTFIkSY6bmUrpvCsIxPbxVnPCjFjalmq9KLUWkJHvKV/SdFjKrGp7TlO58f/ZdTXrtyOMDgukC6uU9+b7/xZ+pjwzWCK1juQxXmwysYLYJ1nHWXnoZKWitpGXdFIKUwhuGJJDzn6vVBkcIylTKC/kEVJxe4WjJKn5FdUEto/8ZrGfRIj3LtivNJmGFQCAGCzmzhVMZYXTOCBlXJ7HiN7dzYj3ZBm4WSp/xkCEi/A8VrNxfBohA+gxxFC9GQ+xN3Lw2b5vZ9fZJtOXnxutR096APG/wFu+8TGMzEWiqHIBWnfeJT5aPm5ik61CtDcVYMxNebZtcdFs80C5VM4VHUx083Ly2gQtmV+mn0LadGz/M9r1ec62Cb/DhwqrY9v3QOkzdI+uwb+P27F7ncF677epexWuiya8spwpPa7Y1qd+Ffe3EbgDsCPAvzZv2Nfdh1jfj44a9BBsMMwxPcb40pGDi2D4fBK8KD9NaqEpHT1GvnAQxB+eSYE6NBAG++Sw46d2cCG9AiCJ94UXYA9sDK1ktOGnPw+gujMRtvlRvsQQG2nuZetkbzZnaSy3GjbvxSb7DopXxhEUbySRzc6so6/7WuOU66QZIwK8hiYQcVY2FMw4jmmY/BGL8J6TSC6cx+kdIR7gfTqd4fPgHUEsBAhQAFAAAAAgAkX5CV7Q/xmoeAQAABAUAAAkAJAAAAAAAAAAgAAAAAAAAAEdyYWRlLmRhdAoAIAAAAAAAAQAYANn4TRNA9dkBAAAAAAAAAAAAAAAAAAAAAFBLAQIUABQAAAAIABtKUldf9yHkUwIAAL0KAAAHACQAAAAAAAEAIAAAAEUBAABKb2IuZGF0CgAgAAAAAAABABgAjuYejz/12QEAAAAAAAAAAAAAAAAAAAAAUEsFBgAAAAACAAIAtAAAAL0DAAAAAA==",
    "fileName" : "Grades and Jobs.zip"
    }

    Note:

    The value of "content" is the Base64-encoded version of the New Grades and Jobs.zip file.

    The content and fileName parameters are required. The following parameters are available for the uploadFile custom action:

    Parameter Description
    content Base64 encoded file to be loaded.
    fileName Name of the file.
    contentId Unique identifier of the file on the Oracle WebCenter Content server. Generated when not supplied.

  5. Configure the Authorization for the request. The user must have the Upload data for Human Capital Management file based import role hierarchy.
  6. Set the Content-Type header to the value:
    application/vnd.oracle.adf.action+json
  7. Click Send.
  8. The response will have this format:

    {
        "result": {
            "Status": "SUCCESS",
            "ContentId": "UCMFA00083489"
        }
    }
  9. Review the response and make note of the Content ID. It's needed to identify the file when initiating HDL and HSDL.
  10. Base64 encode the Jobs.csv file and use these steps to upload the csv file.
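Outside Postman, the uploadFile call can be scripted with standard libraries. A sketch in Python; the environment host, Basic authentication token, and file names are placeholders, and error handling is omitted:

```python
import base64
import json
import urllib.request

def build_upload_file_request(env, basic_token, file_path, file_name):
    """Build the POST request for the dataLoadDataSets uploadFile action."""
    with open(file_path, "rb") as f:
        content = base64.b64encode(f.read()).decode("ascii")
    url = (f"https://{env}/hcmRestApi/resources/11.13.18.05"
           "/dataLoadDataSets/action/uploadFile")
    body = json.dumps({"content": content, "fileName": file_name})
    return urllib.request.Request(
        url,
        data=body.encode("utf-8"),
        headers={
            # The custom action requires this media type.
            "Content-Type": "application/vnd.oracle.adf.action+json",
            "Authorization": "Basic " + basic_token,
        },
        method="POST",
    )

# To send (requires network access to your environment):
# with urllib.request.urlopen(build_upload_file_request(...)) as resp:
#     content_id = json.load(resp)["result"]["ContentId"]
```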

Oracle WebCenter REST

The advantage of the Oracle WebCenter REST resource is that you don't have to encode your file first, so it can handle larger source files. However, it's a bit more complex to use.

The HCM Data Loader uploadFile action uploads files only to the HCM Data Loader import account on the Oracle WebCenter Content server. The Oracle WebCenter REST API lets you upload to any account your user has access to.

Create the Request

  1. Open Postman and create a new POST request for this URL:
  2. https://{{env}}/documents/files/data
  3. Specify the following form-data in the Body
  4. Key Value
    metadataValues
    {
    "dDocAuthor":"{{user}}",
    "dDocTitle":"{file_name}",
    "dSecurityGroup":"FAFusionImportExport",
    "dDocAccount":"hcm$/dataloader$/import$",
    "dDocType":"Document"
    }
    primaryFile

  5. Hover over the primaryFile key to expose the Text/File choice list and set the value to File.

  6. Configure the Authorization for the request. The user must have the Upload data for Human Capital Management file based import role hierarchy.
  7. Save the request. You can now upload files.

Upload Files

In this step you'll use the request created above to upload zip and csv files to the Oracle WebCenter Content server.

  1. In the Body form-data of the request, click Select Files.

    Tip:

    The Select Files button is only available when the primaryFile choice list is set to File.
  2. Use the File Browser to find and select your file.
  3. Edit the metadataValues key dDocTitle value to specify the name for your file.

    Tip:

    When you initiate HCM Data Loader, the data set name defaults to the value you supply here.
  4. Click Send.
  5. The response will have this format:

    {
        "dWebExtension": "zip",
        "dRevClassID": "213206",
        "createdBy": "VISION_INTEGRATION",
        "modifiedBy": "VISION_INTEGRATION",
        "XFND_RANDOM": "4641327459804635554",
        "dReleaseState": "N",
        "dPublishState": "",
        "dID": "213551",
        "xComments": "",
        "errorCode": "0",
        "title": "Grades and Jobs.zip",
        "size": "1125",
        "createdTime": "10/3/23 7:27 AM",
        "dDocName": "UCMFA00213206",
        "mimeType": "application/zip",
        "dRevRank": "0",
        "dDocID": "426034",
        "name": "New Grades and Jobs.zip",
        "dDocAccount": "hcm$/dataloader$/import$",
        "ownedBy": "VISION_INTEGRATION",
        "xStorageRule": "FusionStorageRule",
        "dStatus": "DONE",
        "modifiedTime": "10/3/23 7:27 AM",
        "XFND_SIGNATURE": "Y-Qs7rQwyYANlmfo-....",
        "errorKey": "!csServiceStatusMessage_checkin,UCMFA00213206",
        "dExtension": "zip",
        "dWorkflowState": "",
        "XFND_SCHEME_ID": "1",
        "XFND_CERT_FP": "901B32887DDC81F780757624",
        "dPublishType": "",
        "dUser": "VISION_INTEGRATION",
        "dSecurityGroup": "FAFusionImportExport",
        "errorMessage": "Successfully checked in content item 'UCMFA00213206'.",
        "dRevisionID": "1",
        "version": "1"
    }
  7. Review the response and make note of the Content ID. It's needed to identify the file when initiating HDL and HSDL.

    Note:

    The Oracle WebCenter Content REST API uses the dDocName label for returning the Content ID.
  8. Repeat the Upload Files steps for the csv file.
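If you need to script this upload, the multipart/form-data body can be assembled by hand. A sketch that reuses the metadataValues shown above; the boundary and file content are illustrative:

```python
import json
import uuid

def build_multipart_body(user, doc_title, file_name, file_bytes):
    """Assemble the multipart/form-data body for the WebCenter
    /documents/files/data endpoint: a metadataValues part followed
    by the primaryFile part."""
    boundary = uuid.uuid4().hex
    metadata = json.dumps({
        "dDocAuthor": user,
        "dDocTitle": doc_title,
        "dSecurityGroup": "FAFusionImportExport",
        "dDocAccount": "hcm$/dataloader$/import$",
        "dDocType": "Document",
    })
    parts = (
        f"--{boundary}\r\n"
        'Content-Disposition: form-data; name="metadataValues"\r\n\r\n'
        f"{metadata}\r\n"
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="primaryFile"; '
        f'filename="{file_name}"\r\n'
        "Content-Type: application/octet-stream\r\n\r\n"
    ).encode("utf-8") + file_bytes + f"\r\n--{boundary}--\r\n".encode("utf-8")
    content_type = f"multipart/form-data; boundary={boundary}"
    return content_type, parts
```

POST the returned body to https://{{env}}/documents/files/data with the returned Content-Type header.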


Task 2: Initiate Bulk Data Loading

Upload Data with HCM Data Loader

In this step you'll initiate HCM Data Loader to process a file previously loaded to the hcm$/dataloader$/import$ account on the Oracle WebCenter Content server.

  1. In Postman create a new POST request for this URL:
  2. https://{{env}}/hcmRestApi/resources/11.13.18.05/dataLoadDataSets/action/createFileDataSet
  3. Specify the following JSON Body:
  4. {
        "contentId" : "{content_id}"
    }

    Tip:

    Replace the {content_id} value with the Content ID of your file.

    The only required parameter is contentId. The following parameters are available for the createFileDataSet custom action:

    Parameter Description Configure HCM Data Loader parameter that provides the default value
    contentId Unique identifier of the file on the Oracle WebCenter content server.
    fileAction Valid values are IMPORT_AND_LOAD and IMPORT_ONLY. File Action
    dataSetName Name of the data set. Defaults to the document title of the file.
    importConcurrentThreads Maximum number of concurrent threads to assign to the import process. Maximum Concurrent Threads for Import
    loadConcurrentThreads Maximum number of concurrent threads to assign to the load process. Maximum Concurrent Threads for Load
    importMaxErrorPercentage Maximum percentage of records in error before the import ceases. Maximum Percentage of Import Errors
    loadMaxErrorPercentage Maximum percentage of records in error before the load ceases. Maximum Percentage of Load Errors
    fileEncryption Encryption used for the source file.

    Supply NONE, PGPUNSIGNED or PGPSIGNED.

    Tip:

    You're recommended to always encrypt your files before uploading them to the Oracle WebCenter Content server. The HCM Data Loader import account is often accessed by various users, all of whom can download and review the content of your unencrypted files.

    File Encryption
    verificationKey The verification key used for the source file encryption. Supply when fileEncryption is PGPSIGNED.
    deleteSourceFile Indicates if the source file should be deleted after the data is imported into the staging tables. Delete Source File

    Note:

    The last column in the table specifies the parameter name in the Configure HCM Data Loader task that defines the default for each REST API parameter.
  5. Configure the Authorization for the request. The user must have the Use REST Service - Data Load Data Sets role hierarchy.
  6. Set the Content-Type header to the value:
    application/vnd.oracle.adf.action+json
  7. Click Send.
  8. The response will have this format:

    {
        "result": {
            "Status": "SUCCESS",
            "RequestId": "29796"
        }
    }

    Note:

    Use the RequestId from the response to monitor the status of the data set using the dataLoadDataSets resource.
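Scripted, the createFileDataSet body follows the same JSON pattern as uploadFile. A sketch; the Content ID and the optional parameter values are placeholders:

```python
import json

def file_data_set_payload(content_id, **options):
    """Build the JSON body for the createFileDataSet action.
    contentId is the only required parameter; options may include
    fileAction, dataSetName, importConcurrentThreads, and so on."""
    payload = {"contentId": content_id}
    payload.update(options)
    return json.dumps(payload)

body = file_data_set_payload(
    "UCMFA00083489",                 # Content ID from the upload step
    fileAction="IMPORT_AND_LOAD",
    dataSetName="Grades and Jobs",
)
```

POST the body with the application/vnd.oracle.adf.action+json Content-Type header, as in the steps above.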


Upload Data with HCM Spreadsheet Data Loader

In this step you'll initiate HCM Spreadsheet Data Loader to process a file previously loaded to the hcm$/dataloader$/import$ account on the Oracle WebCenter Content server.

The file must be shaped for the HSDL template that you'll use to upload it.

  1. In Postman create a new POST request for this URL:
  2. https://{{env}}/hcmRestApi/resources/11.13.18.05/dataLoadDataSets/action/createSpreadsheetDataSet
  3. Specify the following JSON Body:
  4. {
        "contentId" : "{content_id}",
        "templateCode" : "ORA_PER_JOB"
    }

    Tip:

    Replace the {content_id} value with the Content ID of your file.

    Tip:

    The ORA_PER_JOB template is a preconfigured template available for loading jobs. You can find template codes using the Run Spreadsheet Data Loader task.

    The only required parameters are contentId and templateCode. The following parameters are available for the createSpreadsheetDataSet custom action:

    Parameter Description Default Value
    contentId Unique identifier of the file on the Oracle WebCenter content server.
    templateCode Spreadsheet template code to upload the file with.
    fileAction Valid values are IMPORT_AND_LOAD and IMPORT_ONLY.

    Tip:

    If you only import the file, you can't monitor or load it with REST. Instead, generate a spreadsheet for the template, fetch the data set, and upload it from the spreadsheet.
    "IMPORT_AND_LOAD"
    dataSetName Name of the data set. Defaults to the template code and timestamp.
    importConcurrentThreads Maximum number of concurrent threads to assign to the import process. Defaulted from the Maximum Concurrent Threads for Import parameter.
    loadConcurrentThreads Maximum number of concurrent threads to assign to the load process. Defaulted from the Maximum Concurrent Threads for Load parameter.
    importMaxErrorPercentage Maximum percentage of records in error before the import ceases. Defaulted from the Maximum Percentage of Import Errors parameter.
    loadMaxErrorPercentage Maximum percentage of records in error before the load ceases. Defaulted from the Maximum Percentage of Load Errors parameter.
    fileEncryption Encryption used for the source file.

    Supply NONE, PGPUNSIGNED or PGPSIGNED.

    Tip:

    You're recommended to always encrypt your files before uploading them to the Oracle WebCenter Content server. The HCM Data Loader import account is often accessed by various users, all of whom can download and review the content of your unencrypted files.

    File Encryption
    verificationKey The verification key used for the source file encryption. Supply when fileEncryption is PGPSIGNED.
    deleteSourceFile Indicates if the source file should be deleted after the data is imported into the staging tables. Delete Source File
    headerIncludedFlag Indicates if a header is included in the source file to name the attributes included in the file. "Y"
    attributeDelimiter Characters used to separate values in the file. "," (comma).
    newLineIndicator Characters used to indicate a new line. "n", prefixed with the escape character.
    escapeIndicator Characters used to escape the delimiter characters within an attribute value. "\" (backslash).
    dateFormat Date format used for attributes with the date data type. The default format is YYYY/MM/DD.

  5. Configure the Authorization for the request. The user must have the Use REST Service - Data Load Data Sets role hierarchy.
  6. Set the Content-Type header to the value:
    application/vnd.oracle.adf.action+json
  7. Click Send.
  8. The response will have this format:
    {
        "result": {
            "Status": "SUCCESS",
            "RequestId": "222125",
            "UserInfo": "",
            "Review": "https://{env}/hcmUI/oracle/apps/hcm/enterpriseSetup/hdlSpreadsheetLoader/di/GenericHdlSpreadsheet.xlsx?layoutCode=ORA_PER_JOB&dataSetName=Job#2023-10-03 13:25:59",
            "FileAction": "IMPORT_AND_LOAD",
            "DataSetName": "Job#2023-10-03 13:25:59"
        }
    }

    Note:

    Use the RequestId from the response to monitor the status of the data set using the dataLoadDataSets resource.

    Tip:

    Use the Review link to generate a spreadsheet from the HSDL template to review the data set.
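The equivalent scripted body for createSpreadsheetDataSet adds the template code. A sketch; the Content ID is a placeholder, and the delimiter and date-format options are illustrative assumptions for a pipe-delimited file:

```python
import json

# Body for createSpreadsheetDataSet: contentId and templateCode are
# required; the remaining options describe the file's formatting and
# can be omitted to use the defaults from the table above.
body = json.dumps({
    "contentId": "UCMFA00083490",     # placeholder Content ID
    "templateCode": "ORA_PER_JOB",
    "attributeDelimiter": "|",        # assumed pipe-delimited file
    "dateFormat": "YYYY/MM/DD",
})
```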


Task 3: Monitor Data Set Status

In this step you'll monitor data set status using the dataLoadDataSets resource. This resource can be used to monitor both HDL and HSDL data sets.

Tip:

HSDL files that are only imported don't create a data set in the HCM Data Loader staging tables and can't be monitored using REST.

Monitor a Specific Data Set

  1. In Postman create a new GET request for this URL:
  2. https://{{env}}/hcmRestApi/resources/11.13.18.05/dataLoadDataSets/{RequestId}

    Note:

    Replace {RequestId} with the RequestId value returned when you submitted your file.
  3. Configure the Authorization for the request and click Send.
  4. The response will have this format:

    {
        "RequestId": 29796,
        "DataSetId": 300100583119028,
        "ContentId": "UCMFA00025146",
        "DataSetName": "Grades and Jobs.zip",
        "DataSetStatusCode": "ORA_IN_ERROR",
        "DataSetStatusMeaning": "Error",
        "TransferStatusCode": "SUCCESS",
        "TransferStatusMeaning": "Success",
        "ImportStatusCode": "SUCCESS",
        "ImportStatusMeaning": "Success",
        "LoadStatusCode": "ERROR",
        "LoadStatusMeaning": "Error",
        "SourceTypeCode": "ZIP_DAT_FILE",
        "SourceTypeMeaning": "Compressed DAT file",
        "IntegrationTypeCode": null,
        "IntegrationTypeMeaning": null,
        "ImportPercentageComplete": 100,
        "LoadPercentageComplete": 100,
        "ImportSuccessPercentage": 100,
        "LoadSuccessPercentage": 59,
        "FileLineTotalCount": 38,
        "FileLineImportErrorCount": 0,
        "FileLineImportSuccessCount": 38,
        "ObjectTotalCount": 37,
        "ObjectLoadErrorCount": 15,
        "ObjectRollbackErrorCount": 0,
        "ObjectSuccessCount": 22,
        "ObjectUnprocessedCount": 0,
        "SpreadsheetTemplateCode": null,
        "SpreadsheetTemplateName": null,
        "SpreadsheetTemplateUserTypeCode": null,
        "SpreadsheetTemplateUserTypeMeaning": null,
        "SpreadsheetMessage": null,
        "FileSize": 1175,
        "CreatedBy": "VISION_INTEGRATION",
        "CreationDate": "2023-10-19T09:46:23+00:00",
        "LastUpdatedBy": "VISION_INTEGRATION",
        "LastUpdateDate": "2023-10-19T09:48:00.720+00:00",
        "ProtectedFlag": false,
        "Review": null,
        "links": [
        ...
        ]
    }
  5. Repeat for the csv file; the response will include values for the spreadsheet-related elements.
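In an integration you'd typically poll this GET until the data set finishes. A sketch with the HTTP call abstracted behind a caller-supplied fetch function; the terminal status codes are also supplied by the caller, since the full set depends on your release:

```python
import time

def wait_for_data_set(fetch, request_id, done_statuses,
                      poll_seconds=30, max_polls=120):
    """Poll until DataSetStatusCode reaches a terminal value.
    fetch(request_id) performs the GET on the dataLoadDataSets
    resource and returns the decoded JSON."""
    for _ in range(max_polls):
        data_set = fetch(request_id)
        if data_set["DataSetStatusCode"] in done_statuses:
            return data_set
        time.sleep(poll_seconds)
    raise TimeoutError(f"Data set {request_id} still in progress")
```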

Identify Data Sets in Error

Use this URL to retrieve summary information for all data sets in error:

https://{{env}}/hcmRestApi/resources/11.13.18.05/dataLoadDataSets?q=DataSetStatusCode=ORA_IN_ERROR&fields=RequestId,DataSetName,DataSetStatusCode,ImportStatusCode,LoadStatusCode,SourceTypeCode,FileLineTotalCount,FileLineImportErrorCount,ObjectTotalCount,ObjectLoadErrorCount,LastUpdateDate&onlyData=true

The response will have this format:

{
    "items": [
        {
            "RequestId": 29796,
            "DataSetName": "Grades and Jobs.zip",
            "DataSetStatusCode": "ORA_IN_ERROR",
            "ImportStatusCode": "SUCCESS",
            "LoadStatusCode": "ERROR",
            "SourceTypeCode": "ZIP_DAT_FILE",
            "FileLineTotalCount": 38,
            "FileLineImportErrorCount": 0,
            "ObjectTotalCount": 37,
            "ObjectLoadErrorCount": 15,
            "LastUpdateDate": "2023-10-19T09:48:00.720+00:00"
        },
        {
            "RequestId": 29650,
            "DataSetName": "1417610_PayrollElementDefinition_Create.zip",
            "DataSetStatusCode": "ORA_IN_ERROR",
            "ImportStatusCode": "ERROR",
            "LoadStatusCode": "SUCCESS",
            "SourceTypeCode": "ZIP_DAT_FILE",
            "FileLineTotalCount": 30,
            "FileLineImportErrorCount": 2,
            "ObjectTotalCount": 30,
            "ObjectLoadErrorCount": 2,
            "LastUpdateDate": "2023-10-19T09:04:53.932+00:00"
        },...
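Filter URLs like the one above are easier to maintain when assembled programmatically. A sketch using Python's urllib.parse; the host is a placeholder:

```python
from urllib.parse import urlencode

def data_sets_query_url(env, q, fields):
    """Build a dataLoadDataSets GET URL with q, fields, and onlyData
    query parameters, URL-encoding the values."""
    base = f"https://{env}/hcmRestApi/resources/11.13.18.05/dataLoadDataSets"
    params = urlencode({
        "q": q,
        "fields": ",".join(fields),
        "onlyData": "true",
    })
    return f"{base}?{params}"

url = data_sets_query_url(
    "myenv.example.com",              # placeholder host
    "DataSetStatusCode=ORA_IN_ERROR",
    ["RequestId", "DataSetName", "LoadStatusCode"],
)
```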

Monitor Data Sets for a Specific HSDL Template

Use this URL to retrieve summary information for all data sets loaded with a specific spreadsheet template:

https://{{env}}/hcmRestApi/resources/11.13.18.05/dataLoadDataSets?q=SpreadsheetTemplateCode=ORA_PER_JOB&fields=RequestId,DataSetName,ImportStatusCode,LoadStatusCode,SourceTypeCode,ObjectTotalCount,ObjectLoadErrorCount,Review&onlyData=true

The response will have this format:

{
    "items": [
       {
            "RequestId": 222125,
            "DataSetName": "Job#2023-10-03 13:25:59",
            "ImportStatusCode": "SUCCESS",
            "LoadStatusCode": "UNPROCESSED",
            "SourceTypeCode": "SPREADSHEET",
            "ObjectTotalCount": 6,
            "ObjectLoadErrorCount": 0,
            "Review": "https://{env}/hcmUI/oracle/apps/hcm/enterpriseSetup/hdlSpreadsheetLoader/di/GenericHdlSpreadsheet.xlsx?layoutCode=ORA_PER_JOB&dataSetName=Job#2023-10-03 13:25:59"
        },
        {
            "RequestId": 219613,
            "DataSetName": "Job#2023-10-03 11:39:14",
            "ImportStatusCode": "SUCCESS",
            "LoadStatusCode": "ERROR",
            "SourceTypeCode": "SPREADSHEET",
            "ObjectTotalCount": 30,
            "ObjectLoadErrorCount": 2,
            "Review": "https://{env}/hcmUI/oracle/apps/hcm/enterpriseSetup/hdlSpreadsheetLoader/di/GenericHdlSpreadsheet.xlsx?layoutCode=ORA_PER_JOB&dataSetName=Job#2023-10-03 11:39:14"
        },
        {
            "RequestId": 66269,
            "DataSetName": "Job#2023-09-25 09:28:18",
            "ImportStatusCode": "SUCCESS",
            "LoadStatusCode": "WARNING",
            "SourceTypeCode": "SPREADSHEET",
            "ObjectTotalCount": 0,
            "ObjectLoadErrorCount": 0,
            "Review": "https://{env}/hcmUI/oracle/apps/hcm/enterpriseSetup/hdlSpreadsheetLoader/di/GenericHdlSpreadsheet.xlsx?layoutCode=ORA_PER_JOB&dataSetName=Job#2023-09-25 09:28:18"
        }


Task 4: Monitor Business Object Status

When your HDL data set has multiple business object files, you may want to monitor the status of the individual business objects.

Review All Business Objects in a Data Set

  1. In Postman create a new GET request for this URL:
  2. https://{{env}}/hcmRestApi/resources/11.13.18.05/dataLoadDataSets/{RequestId}/child/businessObjects?onlyData=true

    Note:

    Replace {RequestId} with the RequestId value returned when you submitted your file.
  3. Configure the Authorization for the request and click Send.
  4. The response will have this format:

    {
        "items": [
            {
                "DataSetBusObjId": 300100583119051,
                "BusinessObjectDiscriminator": "Grade",
                "BusinessObjectName": "Grade",
                "DatFileName": "Grade.dat",
                "LoadOrder": 1,
                "BusinessObjectId": 300100028324439,
                "TransferStatusCode": "SUCCESS",
                "TransferStatusMeaning": "Success",
                "ImportStatusCode": "SUCCESS",
                "ImportStatusMeaning": "Success",
                "LoadStatusCode": "SUCCESS",
                "LoadStatusMeaning": "Success",
                "ImportPercentageComplete": 100,
                "LoadPercentageComplete": 100,
                "ImportSuccessPercentage": 100,
                "LoadSuccessPercentage": 100,
                "FileLineTotalCount": 14,
                "FileLineImportErrorCount": 0,
                "FileLineImportSuccessCount": 14,
                "ObjectTotalCount": 14,
                "ObjectLoadErrorCount": 0,
                "ObjectRollbackErrorCount": 0,
                "ObjectSuccessCount": 14,
                "ObjectUnprocessedCount": 0,
                "DataSetId": 300100583119028,
                "RequestId": 29796,
                "CreatedBy": "VISION_INTEGRATION",
                "CreationDate": "2023-10-19T09:46:24+00:00",
                "LastUpdateDate": "2023-10-19T09:47:19.989+00:00",
                "LastUpdatedBy": "FUSION_APPS_HCM_ESS_LOADER_APPID",
                "RollbackEnabledFlag": false,
                ...


Identify Business Objects in Error

Use this URL to retrieve summary information for all business objects in the data set that errored during load:

https://{{env}}/hcmRestApi/resources/11.13.18.05/dataLoadDataSets/{RequestId}/child/businessObjects?q=LoadStatusCode=ERROR&fields=DataSetBusObjId,LoadOrder,BusinessObjectName,TransferStatusCode,ImportStatusCode,LoadStatusCode,FileLineTotalCount,FileLineImportErrorCount,ObjectTotalCount,ObjectLoadErrorCount&onlyData=true

The response will have this format:

{
    "items": [
        {
            "DataSetBusObjId": 300100583119052,
            "LoadOrder": 2,
            "BusinessObjectName": "Job",
            "TransferStatusCode": "SUCCESS",
            "ImportStatusCode": "SUCCESS",
            "LoadStatusCode": "ERROR",
            "FileLineTotalCount": 24,
            "FileLineImportErrorCount": 0,
            "ObjectTotalCount": 23,
            "ObjectLoadErrorCount": 15
        }
    ]...


Task 5: Retrieve Messages

When your data set has errors, you can retrieve the messages raised for it along with identifiers for the records each message is reported against.

Review All Messages for a Data Set

  1. In Postman create a new GET request for this URL:
  2. https://{{env}}/hcmRestApi/resources/11.13.18.05/dataLoadDataSets/{RequestId}/child/messages?totalResults=true&orderBy=DatFileName,FileLine&fields=DatFileName,BusinessObjectDiscriminator,OriginatingProcessCode,FileLine,ConcatenatedUserKey,SourceSystemOwner,SourceSystemId,SourceReference001,MessageTypeCode,MessageText,MessageUserDetails&onlyData=true

    Note:

    Replace {RequestId} with the RequestId value returned when you submitted your file.
  3. Configure the Authorization for the request and click Send.
  4. The response will have this format:

    {
        "items": [
            {
                "DatFileName": "Job.dat",
                "BusinessObjectDiscriminator": "Job",
                "OriginatingProcessCode": "LOAD",
                "FileLine": 2,
                "ConcatenatedUserKey": "HDL_SALES_CONS-COMMON",
                "SourceSystemOwner": null,
                "SourceSystemId": null,
                "SourceReference001": "HDL_SALES_CONS",
                "MessageTypeCode": "ERROR",
                "MessageText": "You need to enter a valid value for the JobFamilyId attribute. The current values are HDL_SALES.",
                "MessageUserDetails": null
            },
            {
                "DatFileName": "Job.dat",
                "BusinessObjectDiscriminator": "Job",
                "OriginatingProcessCode": "LOAD",
                "FileLine": 4,
                "ConcatenatedUserKey": "HDL_SNR_SALES_CONS-COMMON",
                "SourceSystemOwner": null,
                "SourceSystemId": null,
                "SourceReference001": "HDL_SNR_SALES_CONS",
                "MessageTypeCode": "ERROR",
                "MessageText": "You need to enter a valid value for the JobFamilyId attribute. The current values are HDL_SALES.",
                "MessageUserDetails": null
            },
            {
                "DatFileName": "Job.dat",
                "BusinessObjectDiscriminator": "Job",
                "OriginatingProcessCode": "LOAD",
                "FileLine": 5,
                "ConcatenatedUserKey": "HDL_SALES_MGR-COMMON",
                "SourceSystemOwner": null,
                "SourceSystemId": null,
                "SourceReference001": "HDL_SALES_MGR",
                "MessageTypeCode": "ERROR",
                "MessageText": "The FT value for the FullPartTime attribute is invalid and doesn't exist in the  list of values.",
                "MessageUserDetails": null
            },
    ...
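Once retrieved, the message items are easy to summarize client-side. A sketch that tallies error messages by file and message text:

```python
from collections import Counter

def summarize_messages(items):
    """Count ERROR messages per (DatFileName, MessageText) pair from
    the items array returned by the messages child resource."""
    return Counter(
        (m["DatFileName"], m["MessageText"])
        for m in items
        if m["MessageTypeCode"] == "ERROR"
    )
```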
    


Task 6: Retrieve Submitted Process Details

Typically, you would see a record for the transfer, import and load processes for each business object in your data set. There will be additional processes if you have submitted further processing of your data set.

  1. In Postman create a new GET request for this URL:
  2. https://{{env}}/hcmRestApi/resources/11.13.18.05/dataLoadDataSets/{RequestId}/child/processes?onlyData=true
  3. Configure the Authorization for the request and click Send.
  4. The response will have this format:

    {
        "items": [
            {
                "ProcessId": 300100583119105,
                "DataSetId": 300100583119028,
                "DataSetBusObjId": 300100583119051,
                "ProcessCode": "ORA_TRANSFER_OBJECT",
                "ProcessName": "Transfer Business Object",
                "DatFileName": "Grade.dat",
                "BusinessObjectName": "Grade",
                "FileActionCode": null,
                "FileActionMeaning": null,
                "TotalCount": 14,
                "SuccessCount": 14,
                "ErrorCount": 0,
                "UnprocessedCount": 0,
                "StartTime": "2023-10-19T09:46:28.216+00:00",
                "EndTime": "2023-10-19T09:46:47.056+00:00",
                "ElapsedTime": "+000000000 00:00:18.840000000",
                "ThreadsAllocated": 2,
                "ThreadsUsed": 1,
                "RequestId": 29802,
                "CreatedBy": "FUSION_APPS_HCM_ESS_LOADER_APPID",
                "CreationDate": "2023-10-19T09:46:28.042+00:00",
                "LastUpdateDate": "2023-10-19T09:46:48.180+00:00",
                "LastUpdatedBy": "FUSION_APPS_HCM_ESS_LOADER_APPID",
                "ParentRequestId": 29796
            },
            {
                "ProcessId": 300100583129784,
                "DataSetId": 300100583119028,
                "DataSetBusObjId": 300100583119051,
                "ProcessCode": "ORA_IMPORT_OBJECT",
                "ProcessName": "Import Business Object",
                "DatFileName": "Grade.dat",
                "BusinessObjectName": "Grade",
                "FileActionCode": null,
                "FileActionMeaning": null,
                "TotalCount": 14,
                "SuccessCount": 14,
                "ErrorCount": 0,
                "UnprocessedCount": 0,
                "StartTime": "2023-10-19T09:46:49.261+00:00",
                "EndTime": "2023-10-19T09:46:52.874+00:00",
                "ElapsedTime": "+000000000 00:00:03.613000000",
                "ThreadsAllocated": 2,
                "ThreadsUsed": 1,
                "RequestId": 29804,
                "CreatedBy": "VISION_INTEGRATION",
                "CreationDate": "2023-10-19T09:46:48.044+00:00",
                "LastUpdateDate": "2023-10-19T09:46:54.089+00:00",
                "LastUpdatedBy": "FUSION_APPS_HCM_ESS_LOADER_APPID",
                "ParentRequestId": 29796
            },
            {
                "ProcessId": 300100583119127,
                "DataSetId": 300100583119028,
                "DataSetBusObjId": 300100583119051,
                "ProcessCode": "ORA_LOAD_OBJECT",
                "ProcessName": "Load Business Object",
                "DatFileName": "Grade.dat",
                "BusinessObjectName": "Grade",
                "FileActionCode": "IMPORT_AND_LOAD",
                "FileActionMeaning": "Import and load",
                "TotalCount": 14,
                "SuccessCount": 14,
                "ErrorCount": 0,
                "UnprocessedCount": 0,
                "StartTime": "2023-10-19T09:46:55.292+00:00",
                "EndTime": "2023-10-19T09:47:20.823+00:00",
                "ElapsedTime": "+000000000 00:00:25.531829000",
                "ThreadsAllocated": 8,
                "ThreadsUsed": 1,
                "RequestId": 29805,
                "CreatedBy": "FUSION_APPS_HCM_ESS_LOADER_APPID",
                "CreationDate": "2023-10-19T09:46:55.051+00:00",
                "LastUpdateDate": "2023-10-19T09:47:20.442+00:00",
                "LastUpdatedBy": "FUSION_APPS_HCM_ESS_LOADER_APPID",
                "ParentRequestId": 29796
            },...


Task 7: Submit Processing of an Existing Data Set

If your HCM Data Loader file was only imported, or you want to attempt to load previously failed records, you can submit load for your data set. If you have successfully loaded records, you can submit roll back for your data set. Roll back is only initiated for business objects that support the roll back action.

Tip:

The RollbackEnabledFlag element on the businessObjects resource indicates if the business object supports being rolled back.

Submit Processing of a Data Set

  1. In Postman create a new POST request for this URL:
  2. https://{{env}}/hcmRestApi/resources/11.13.18.05/dataLoadDataSets/{RequestId}/action/submit

    Note:

    Replace {RequestId} with the RequestId value returned when you submitted your file.
  3. Specify the following JSON Body:
  4. {
    "fileAction" : "LOAD"
    }

    The following parameters are available for this custom action:

    fileAction
      Valid values are LOAD and ROLLBACK. Default value: "LOAD".

      Tip:

      For HSDL files, if you only import the file, you can't load it with
      REST. Instead, generate a spreadsheet for the template, fetch the
      data set into it, and upload it from the spreadsheet.

    loadConcurrentThreads
      Maximum number of concurrent threads to assign to the load process.
      Default value: the Maximum Concurrent Threads for Load parameter.

  5. Set the Content-Type header to the value:
    application/vnd.oracle.adf.action+json
  6. Configure the Authorization for the request and click Send.
  7. The response will have this format:
    {
        "result": {
            "Status": "SUCCESS",
            "RequestId": "127366"
        }
    }
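
The steps above can be assembled programmatically. This sketch (the helper name and hostname are illustrative; sending the request is left to your HTTP client) builds the URL, headers, and body for the submit action and validates the fileAction value:

```python
# Illustrative helper that assembles the submit call described above.
def build_submit_request(env, request_id, file_action="LOAD",
                         load_concurrent_threads=None):
    if file_action not in ("LOAD", "ROLLBACK"):
        raise ValueError("fileAction must be LOAD or ROLLBACK")
    url = (f"https://{env}/hcmRestApi/resources/11.13.18.05/"
           f"dataLoadDataSets/{request_id}/action/submit")
    body = {"fileAction": file_action}
    if load_concurrent_threads is not None:
        body["loadConcurrentThreads"] = load_concurrent_threads
    # The custom actions require this specific media type.
    headers = {"Content-Type": "application/vnd.oracle.adf.action+json"}
    return url, headers, body

url, headers, body = build_submit_request("example.fa.ocs.oraclecloud.com", 29796)
print(url)
print(body)  # {'fileAction': 'LOAD'}
```

The same shape works for a POST from any client; only the authorization handling differs.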


Submit Processing for a Business Object within a Data Set

The submit custom action is also available at the business object level where it has the same parameters but only submits processing of the business object specified by the URL.

Use this URL to initiate processing for a specific business object.

https://{{env}}/hcmRestApi/resources/11.13.18.05/dataLoadDataSets/{RequestId}/child/businessObjects/{DataSetBusObjId}/action/submit

Note:

Replace {DataSetBusObjId} with the DataSetBusObjId value returned when you reviewed the business object within the data set.

Tip:

You can test this by Base64 encoding, uploading, and submitting the JobFamily.zip file, which creates the job families referenced by the Job.dat file that failed earlier.

Once the job families are loaded successfully, initiate LOAD for the data set or Job business object within the data set as described above.
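
Base64 encoding the file for the uploadFile content attribute is a one-liner in most languages. A minimal Python sketch (the file name is a placeholder; substitute your own path):

```python
import base64

# Base64-encode a file's bytes for the "content" attribute of the
# uploadFile action. "JobFamily.zip" is a placeholder path.
def encode_for_upload(path):
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("ascii")

# Example with in-memory bytes instead of a file on disk:
print(base64.b64encode(b"JobFamilyName|...").decode("ascii"))
```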



Task 8: Stop an In-Progress Data Set or Business Object

You can request to stop in-progress data sets and business objects.

Stop Processing of a Data Set

  1. In Postman create a new POST request for this URL:
  2. https://{{env}}/hcmRestApi/resources/11.13.18.05/dataLoadDataSets/{RequestId}/action/stop

    There are no parameters for this action.

  3. Set the Content-Type header to the value:
    application/vnd.oracle.adf.action+json
  4. Configure the Authorization for the request and click Send.
  5. If the data set isn't currently in progress, you'll receive this response instead:

    {
        "result": "You can't stop processing this data set because it isn't being processed. Check the current status of the data set."
    }


Stop Processing of a Business Object

The stop custom action is also available at the business object level. Use this URL to request processing to be stopped for a specific business object:

https://{{env}}/hcmRestApi/resources/11.13.18.05/dataLoadDataSets/{RequestId}/child/businessObjects/{DataSetBusObjId}/action/stop

Note:

Replace {DataSetBusObjId} with the DataSetBusObjId value returned when you reviewed the business object within the data set.

Task 9: Delete a Data Set

You can delete the staging table data for your data sets if they are not being processed.

Delete HDL Stage Table Data

  1. In Postman create a new POST request for this URL:
  2. https://{{env}}/hcmRestApi/resources/11.13.18.05/dataLoadDataSets/{RequestId}/action/deleteDataSet

    There are no parameters for this action.

  3. Set the Content-Type header to the value:
    application/vnd.oracle.adf.action+json
  4. Configure the Authorization for the request and click Send.

Delete HSDL Stage Table Data

Submitting this action will delete your data set from both the HSDL and HDL staging tables.

  1. In Postman create a new POST request for this URL:
  2. https://{{env}}/hcmRestApi/resources/11.13.18.05/dataLoadDataSets/{RequestId}/action/deleteSpreadsheetDataSet

    There are no parameters for this action.

  3. Set the Content-Type header to the value:
    application/vnd.oracle.adf.action+json
  4. Configure the Authorization for the request and click Send.
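
Because the two delete actions differ only in name, an automation script can pick the right one based on whether the data set was created by HSDL. A small illustrative helper (not part of the product API; the hostname is a placeholder):

```python
# Map the delete tasks above to their action URLs. Use
# spreadsheet=True for HSDL data sets: deleteSpreadsheetDataSet
# removes the data set from both the HSDL and HDL staging tables.
def delete_action_url(env, request_id, spreadsheet=False):
    action = "deleteSpreadsheetDataSet" if spreadsheet else "deleteDataSet"
    return (f"https://{env}/hcmRestApi/resources/11.13.18.05/"
            f"dataLoadDataSets/{request_id}/action/{action}")

print(delete_action_url("example.fa.ocs.oraclecloud.com", 29796))
```

As with the other custom actions, POST to this URL with the Content-Type header set to application/vnd.oracle.adf.action+json and no body parameters.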


More Learning Resources

Explore other labs on docs.oracle.com/learn or access more free learning content on the Oracle Learning YouTube channel. Additionally, visit education.oracle.com/learning-explorer to become an Oracle Learning Explorer.

For product documentation, visit Oracle Help Center.