10 Python Application Programming Interface (API) packages specific to the Data Studio tool

You use the ORDS Python API to create and manage Analytic Views, perform insight operations, and carry out other operations for accessing and managing Oracle Database, such as functions that upload files to the database from a local drive or Cloud Store, or that import tables over database links.

Installing Python

The following steps will guide you through the process of installing Python.

  1. You must build Python from source code. First, install the required development packages with yum, as follows:
    # yum install gcc openssl-devel bzip2-devel
  2. Download a gzipped source tarball of the latest Python 3 release:
    # cd /usr/src
    # wget https://www.python.org/ftp/python/3.6.4/Python-3.6.4.tgz
  3. Unpack the package:
    # tar xzf Python-3.6.4.tgz
  4. Compile the source code and install Python:
    # cd Python-3.6.4
    # ./configure --enable-optimizations
    # make altinstall
  5. Locate the newly installed Python executable:
    # which python3.6
    /usr/local/bin/python3.6
  6. Launch the Python prompt to make sure it works:
    # python3.6
    Python 3.6.4 (default, Dec 20 2017, 20:48:42)
    [GCC 4.4.7 20120313 (Red Hat 4.4.7-18)] on linux
    Type "help", "copyright", "credits" or "license" for more information.

You now need to install Requests, a Python HTTP library that simplifies working with HTTP requests, providing an easy-to-use interface for making REST calls from Python.

The following steps describe how to install Requests from source code:
  1. Download the Requests library as the source code from GitHub, where it is actively developed:
    "https://github.com/requests/requests/tarball/master"
  2. Unpack the downloaded gzipped source tarball in your system and run the cd command to change to the unpacked source directory.
  3. Install the library using the pip command:
    $ pip install ords_python-0.1.0-py3-none-any.whl
  4. After the installation completes, check to make sure it was successful:
    $ python3.6
    >>> import requests
    >>>

    This completes the process of setting up your working environment.

  5. To connect to ORDS via your username and password, run the following command:
    import adp
    ords = adp.login('<url>', '<username>', '<password>')
    Here are the parameters and their descriptions:

    url: This specifies the ORDS host and port, including the protocol (http or https).

    username: This specifies the schema name.

    password: This specifies the password for the schema.

    This command returns an autonomous database instance object that exposes all of the autonomous database functions.

Quick Start

Use the Python Requests library's REST calls to manipulate data.
  1. In Python, you make a POST request with the requests.post method, sending a dictionary of column names and values as the data parameter.
  2. Oracle REST Data Services processes the request, transforming it into a corresponding INSERT or UPDATE statement issued against the underlying data source. It then sends a response back to the caller, indicating the success or failure of the operation.
  3. In Python, the response is available in a Response object returned by the requests.post method, so you can check whether the operation succeeded.
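The steps above can be sketched as follows. The endpoint URL, schema, and table alias below are placeholders for illustration, not values from this document, and the actual network call is shown only as a comment because it needs a reachable ORDS instance:

```python
import json

# Hypothetical ORDS AutoREST endpoint; host, schema, and table alias are
# placeholders for illustration only.
url = "https://example.com/ords/schema_name/sales/"

# Step 1: the row to insert, as a plain dictionary keyed by column name.
row = {"CUST_ID": 100, "QUANTITY_SOLD": 5}

# ORDS expects a JSON body, so the dictionary is serialized before sending.
payload = json.dumps(row)

# Steps 2-3: with the Requests library installed and a reachable ORDS
# instance, the POST and the success check would look like this:
#
#   import requests
#   response = requests.post(url, data=payload,
#                            headers={"Content-Type": "application/json"})
#   response.raise_for_status()  # raises on any non-2xx status
print(payload)
```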

Supported Methods

You can create, view, delete and manage Oracle Database REST API instances using standard HTTP method requests, as summarized in the following table.

Status Codes

In the case of an exception, all functions return the readable status of the RESTful call; a normal call returns the data you requested. You can view the HTTP status code using adp.get_rest().status_code.
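For instance, a small standard-library helper (a sketch, not part of the API; adp.get_rest() is the accessor named above) can turn such a status code into readable text:

```python
from http import HTTPStatus

def readable_status(code):
    """Translate an HTTP status code into a short readable string."""
    try:
        return f"{code} {HTTPStatus(code).phrase}"
    except ValueError:
        return f"{code} Unknown"

# After a call, you could feed it the code reported by the API, e.g.:
#   readable_status(adp.get_rest().status_code)
print(readable_status(200))  # 200 OK
print(readable_status(404))  # 404 Not Found
```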

Authentication procedure

The Python API supports two different authentication processes. The first is to log in, which connects the tool to ORDS using a username and password. The second uses SODA: you first connect to the database through an Oracle Database client Python package (cx_Oracle or oracledb) using a connection string, username, and password, which gives you a connection instance. From the connection instance, obtain a cursor instance and then call the connect() method. The second parameter of the connect() method is the URL. If you use a production instance where the connection port is 443 (the HTTPS default), you can skip the URL, and the Python API constructs it from the database call made through the cursor.
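The second flow can be sketched as follows. The driver call mirrors standard oracledb usage, but the adp.connect() signature shown is an assumption based on the description above, and the host and service names are placeholders:

```python
def build_dsn(host, port, service_name):
    """Assemble an Easy Connect string of the form host:port/service."""
    return f"{host}:{port}/{service_name}"

dsn = build_dsn("dbhost.example.com", 1521, "mypdb")

# With a real database available, the flow would continue like this
# (requires the oracledb and adp packages):
#
#   import oracledb, adp
#   connection = oracledb.connect(user="schema_name",
#                                 password="schema_password", dsn=dsn)
#   cursor = connection.cursor()
#   ords = adp.connect(cursor, "https://dbhost.example.com:443/ords")
#
# On a production instance listening on port 443 (the HTTPS default),
# the URL argument can be omitted and is derived through the cursor.
print(dsn)
```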

Tasks

Analytic View functions: These services relate to the tables and views that provide information about Analytic Views.

ords.Analytics.listAnalyticViews()

This returns a list of Analytic View names.

ords.Analytics.createAnalyticView('SALES')

This creates an Analytic View based on the fact table. Measures and dependencies are selected automatically based on the columns of the fact table in the schema of the owner.

ords.Analytics.dropAV('SALES_AV')

This deletes the Analytic View.

ords.Analytics.getMeasureList('SALES_AV')

This returns the metadata of all measures of the Analytic View in the schema of the owner.

Analytics.getEntityDDL(entityType, entityName,
                      owner=None)

This generates DDL for creating or replacing an entity in the schema of the owner (if owner is missing or set to None, the current schema is used). entityType can be 'TABLE', 'VIEW', or 'ANALYTIC_VIEW'.

Analytics.getAnalyticViewDataPreview(entityName, owner=None)

This returns metadata of the Analytic View, namely hierarchies, measures, aggregated data, and the SQL query for selecting data from the Analytic View.

Analytics.getAnalyticViewMetadata(avName,
                    owner=None)

This returns detailed metadata of the Analytic View in the schema of the owner (if owner is missing or set to None, the current schema is used).

Analytics.getAnalyticViewQualityReport(avName, hierarchies,
                          owner=None)

This returns a quality report for the Analytic View in the schema of the owner (if owner is missing or set to None, the current schema is used).

Analytics.getDimensionNames(avName)

This returns an array of dimension names of the Analytic View.

Analytics.getFactTableName(avName)

This returns the name of the fact table of the Analytic View.

Analytics.getErrorClassesFromDim(avName,
                    dimension)

This returns information about errors in the specified dimension found while analyzing the Analytic View.

Analytics.getErrorClassesFromFactTab(avName,
                    factTab)

This returns information about errors in the specified fact table found while analyzing the Analytic View.

Analytics.getFactTabErrorDetails(avName, factTab, errClass, errCol,
                          recCount=-1)

This returns a detailed explanation of the error in the fact table; errClass is taken from the response of getErrorClassesFromFactTab.

Analytics.getDimErrorDetails(avName, dimension, errClass,
                          recCount=-1)

This returns a detailed explanation of the error in the dimension; errClass is taken from the response of getErrorClassesFromDim.

Analytics.getAnalyticViewData(levels, columnNames, entityName,
                          hierarchies, measures, whereCondition, owner=None)

This returns the aggregated data from the Analytic View in the schema of owner.

Analytics.getSQL(levels, columnNames, entityName, hierarchies, measures,
                          whereCondition, owner=None)

This returns SQL Query that is used in getAVData. All parameters are the same as in getAVData.

Analytics.downloadTableauWorkbook(avName,
                    fileName)

This extracts the Analytic View avName into the Tableau workbook file fileName. The typical extension of the file is ".tds".

Analytics.downloadPowerBITemplate(avName,
                    fileName)

This extracts the Analytic View avName into the Microsoft PowerBI template file fileName. The typical extension of the file is ".pbit".

manifestDownload(fileName, os=None)

This downloads the installer for Excel manifest to the file fileName.

Miscellaneous functions

These services are related to miscellaneous activities such as returning the results of a query or generating data.

runQuery(statement, offset=None, limit=None,
                      asof=None)

This returns the results of the query.

generateData(table, dataField, skip = None, prefix = "Data")

This function plots a graph and generates two lists: categories, which is the prefix plus a serial number, and data, which holds the values of the column dataField.

table is the result of a SELECT statement (see the runQuery or getAVData methods).

If skip is None or contains the column name, all records with a None value are omitted.
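The list-building behavior just described (not the plotting) can be mimicked in plain Python; this is an illustrative re-implementation, not the actual function, and it assumes table is a list of dictionaries as returned by runQuery:

```python
def generate_data(table, data_field, skip=None, prefix="Data"):
    """Build the two lists described above: category labels made of the
    prefix plus a serial number, and the values of column data_field.
    Records whose value is None are omitted when skip is None or names
    the column."""
    omit_none = skip is None or skip == data_field
    categories, data = [], []
    for serial, record in enumerate(table, start=1):
        value = record.get(data_field)
        if value is None and omit_none:
            continue
        categories.append(f"{prefix}{serial}")
        data.append(value)
    return categories, data

# Illustrative query result with one None value:
table = [{"AMOUNT_SOLD": 120}, {"AMOUNT_SOLD": None}, {"AMOUNT_SOLD": 95}]
categories, data = generate_data(table, "AMOUNT_SOLD")
print(categories)  # ['Data1', 'Data3']
print(data)        # [120, 95]
```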

globalSearch(searchString, rowstart, numrow, sortBy=[],
                                      hideSystemTables=False, hidePrivateTables=False,
                                      resultapp=None, resultannotation=None)

This returns the result of searching databases, tables, and other artifacts from ORDS.

Load Data functions

These services relate to the tables and views that provide information about loading data.

Tables and Views operations

listTables(owner=None)

This returns a list of tables; owner is the schema name of the tables (None means that the current schema is used).

listViews(owner=None)

This returns a list of views; owner is the schema name of the views (None means that the current schema is used).

dropTable(tableName)

This drops the table tableName.

dropView(viewName)

This drops the view viewName.

dropTables(tableNames)

This drops the tables from the list tableNames.

dropViews(viewNames)

This drops the views from the list viewNames.

Local files operations

The following loads data from local files into the current database. It supports the following file formats: CSV, TSV, XML, JSON, XLSX, PARQUET.

loadFiles(files)

Here, files is a list of dictionaries with the following fields:

{'fileName': fileName, 'targetTableName': targetTableName, 'options': options}

fileName is the full path to the data file.

targetTableName is the name of the created table. It is an optional parameter; if it is missing or equal to None, the file name is used as the table name. During processing, the target table may be slightly renamed if the current database already contains a table with that name. For example, instead of "DATA" the method copies the records to the table "DATA_1" if the original table exists.

The options argument is optional for the JSON and XLSX formats. The following are the parameter names with their descriptions:

  • Add constraint: This parameter adds a column constraint on creation of a table. It validates that the VARCHAR/CLOB column is in JSON format. The default value is false.
  • sheet: This parameter specifies the name of the sheet or index of the sheet used for loading data. The default value is 0.

Database Link operations

The following methods are used to import tables from a Database Link into the current schema.

getDatabaseLinks(owner=None)

This returns a list of available database links.

getDBLinkOwnerTables(dbLink)

This returns the tables and views from the database link.

getTableDetails(owner, tableName, dbLink)

This returns detailed information about the selected table in the database link.

copyDatabaseTablesFromDbLink(tables)

This loads data from the specified tables into the current database.

linkDatabaseTablesFromDbLink(tables)

This creates views in the current database from the specified tables.

Cloud Storage Link operations

These methods create cloud storage links, list objects in a cloud storage link, and import cloud objects into tables or external tables of the current schema.

createCloudStorageLink(storageLinkName, uri, credentialName, authUser,
                          authPassword, description = None)

Here are the parameters and their descriptions:

storageLinkName: This specifies the name for the cloud storage link.

uri: This specifies the URL path.

credentialName: The credential name must conform to Oracle object naming conventions.

authUser: This is the OCI user name.

authPassword: This specifies the auth token.

description: This specifies a description for the link. storageLinkName is used if the description is missing.

getCloudBuckets(owner=None)

This returns the list of cloud storage links.

getCloudObjects(storageLink, owner=None)

This returns the list of objects in the cloud storage link.

getConsumerGroups()

This returns the list of available consumer groups.

copyCloudObjects(objects, consumerGroup='LOW')

This copies cloud objects from the cloud storage link to the tables in the current schema.

Here are the parameters and their descriptions:

objects: This specifies the list of dictionaries with the following fields, namely, storageLink, objectName, and targetTableName.

consumerGroup: This defines the level of performance and concurrency while copying the cloud object. The values are HIGH, MEDIUM, and LOW.

linkCloudObjects(objects, consumerGroup='LOW')

This creates external tables based on cloud objects from the cloud storage link.

getIngestJobStatus(job, owner=None)

This gets the status of the ingest job.

Here is the parameter and its description:

job: This specifies the output of the copy cloud objects operation.

Ingest Job operations

getIngestJobsList(owner=None)

This returns the list of all ingest jobs.

deleteIngestJob(owner, jobName)

This deletes an ingest job by name.

listLiveTableFeed(owner=None)

This returns the list of Live Table Feeds.

createScheduler(timeType='HOURLY', interval=12, weekDays = None, startDate = None, endDate =
                    None)

This creates scheduler parameters for live feed with the following arguments:

timeType: This specifies the time type of the interval. It can have the following values MINUTELY, HOURLY, DAYLY and WEEKLY.

interval: This specifies the interval to poll.

weekDays: This specifies the array of weekdays to poll.

startDate: This specifies the start date of the job in ISO 8601 format.

endDate: This specifies the end date of the job in ISO 8601 format.

createLiveFeedJob(cloudStorageLink, name='LIVE_TABLE_FEED', tableName='LIVE_TABLE', scheduler=None, objectFilter=None)

This creates a live table feed.

Here are the parameters with their descriptions:

cloudStorageLink: This specifies the name of the cloud storage link.

name: This specifies the name of the live feed.

tableName: This specifies the table name of the live feed.

scheduler: This specifies the scheduler parameters.

objectFilter: This specifies the regular expression to limit the live table feed to only those files in the bucket that match the expression.
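Putting the two calls together, a feed creation might look like the following sketch. The storage link, feed, and table names are illustrative placeholders, and the objectFilter expression shown restricts the feed to CSV files; only the regular-expression check actually runs here:

```python
import re

# The objectFilter value is a regular expression matched against object
# names in the bucket; this illustrative pattern accepts only .csv files.
object_filter = r".*\.csv$"

# Verify what the pattern would admit:
print(bool(re.match(object_filter, "users/weblogic/testData.csv")))  # True
print(bool(re.match(object_filter, "users/weblogic/data.json")))     # False

# With a connected instance (see the login step), the feed would be
# created roughly like this, polling every 6 hours:
#
#   scheduler = ords.createScheduler(timeType='HOURLY', interval=6)
#   ords.createLiveFeedJob('ILIA', name='SALES_FEED',
#                          tableName='SALES_LIVE',
#                          scheduler=scheduler,
#                          objectFilter=object_filter)
```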

dropLiveFeed(liveFeedName)

This drops the Live Feed liveFeedName.

getLiveFeedEvents(liveFeedName, owner=None)

This returns the list of events for the live feed liveFeedName.

getLiveFeedNotifyState(liveFeedName)

This gets the notification state of the Live Feed.

setLiveFeedNotifyState(liveFeedName, state)

This sets the notification state of the Live Feed.

runLiveFeedOnce(liveFeedName)

This runs the Live Feed liveFeedName immediately.

suspendLiveFeed(liveFeedName)

This suspends the Live Feed liveFeedName.

resumeLiveFeed(liveFeedName)

This resumes the Live Feed liveFeedName.

Write back operations

These methods allow writing data into an existing table.

Analytics.getTableColumns(tableName, owner=None, limit=256, offset=0)

This function returns the list of column information for the table. If owner is omitted or equal to None, the current schema is used.

Analytics.getTableConstraints(tableName, owner=None, limit=256, offset=0)

This function returns the list of constraints of the table tableName in the schema owner. If owner is omitted or equal to None, the current schema is used.

Analytics.insertRowTable(tableName, data, mapping=None,
                            owner=None)

This function inserts the data data into the table tableName in the schema owner. If owner is omitted or equal to None, the current schema is used. data has the form {"data column": value, ...}. To match data columns with table columns, the mapping JSON is used; skip this argument if data contains the column names of the table.

Analytics.updateRowTable(tableName, data,  whereCol, mapping=None,
                            owner=None)

This function updates the data data in the specified row of the table tableName in the schema owner. If owner is omitted or equal to None, the current schema is used. data has the form {"data column": value, ...}; the value of whereCol must also be included in data. To match data columns with table columns, the mapping JSON is used; skip this argument if data contains the column names of the table.
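As a sketch of the payloads these two calls take (the table and column names are placeholders, and the mapping shown is only an illustrative shape):

```python
# Row values keyed by data-column name; here the keys already match the
# table's column names, so the mapping argument could be skipped.
data = {"CUST_ID": 100, "QUANTITY_SOLD": 7}

# If the keys differed from the table columns, a mapping such as this
# illustrative one would pair them up:
mapping = {"customer": "CUST_ID", "quantity": "QUANTITY_SOLD"}

# With a connected instance (sketch):
#   ords.Analytics.insertRowTable('SALES', data)
#   # updateRowTable locates the row via whereCol, whose value must also
#   # be present in data:
#   ords.Analytics.updateRowTable('SALES', data, 'CUST_ID')
print(sorted(data))
```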

Code Examples

The code examples in this section demonstrate using the ORDS API for Python to call the REST API.

Before running an example, your login credentials must be configured for a database instance, and you must be connected to ORDS as described in the Introduction section. Run the following steps to satisfy this prerequisite.

import adp
ords = adp.login('<protocol://host:port>', 'schema_name', 'schema_password')

You can use the following examples to create and drop an analytic view, get the name of the fact table associated with the analytic view and the dimension names associated with it, get error information from the fact table and dimensions, get metadata of the analytic view, and get analytic view data:

  • Create the Analytic View
  • Drop the Analytic View
  • Get Fact table
  • Get Dimension Names
  • Get Error class from Fact table
  • Get Error class from Dimension
  • Get the Analytic View Metadata
  • Get the Analytic View Data

Create the Analytic View

ords.Analytics.createAnalyticView('SALES')
Output:
'success'

Drop the Analytic View

ords.Analytics.dropAnalyticView('SALES_AV')

Get Fact table

ords.Analytics.getFactTableName('SALES_AV')
Output:
{
"FACT_TABLE_NAME": "SALES"
}

Get Dimension Names

ords.Analytics.getDimensionNames('SALES_AV')

Output:
[
  {"DIMENSION_NAME": "SALES_AV_CHANNELS_AD", "CAPTION_NAME": "CHANNELS"},
  {"DIMENSION_NAME": "SALES_AV_CUST_ID_AD", "CAPTION_NAME": "CUST_ID"},
  {"DIMENSION_NAME": "SALES_AV_PRODUCTS_AD", "CAPTION_NAME": "PRODUCTS"},
  {"DIMENSION_NAME": "SALES_AV_PROMO_ID_AD", "CAPTION_NAME": "PROMO_ID"},
  {"DIMENSION_NAME": "SALES_AV_TIMES_AD", "CAPTION_NAME": "TIMES"}
]

Get Error class from Fact table

ords.Analytics.getErrorClassesFromFactTab('SALES_AV', 'SALES')
  
Output:
{
  "OBJECT_NAME": "SALES",
  "ERROR_COUNT": 0,
  "id": null,
  "color": null,
  "errorInPercentage": null,
  "errorData": []
}

Get the Analytic View Metadata

ords.Analytics.getAnalyticViewMetadata('SALES_AV')
  
Output:
{
  "hierarchies": [
    {
      "name": "SALES_AV_CUST_ID_HIER",
      "alias": "CUST_ID",
      "owner": "ADPTEST",
      "captionColumnName": "SALES_AV_CUST_ID_HIER",
      "orderColumnName": "SALES_AV_CUST_ID_HIER_HIER_ORDER",
      "caption": "CUST_ID",
      "levels": [
        {
          "keys": [
            {
              "name": "CUST_ID_ATTR",
              "columnName": "SALES_AV_CUST_ID_HIER_CUST_ID_ATTR"
            }
          ],
          "altKeys": [],
          "attributes": [],
          "ordering": [
            {
              "name": "CUST_ID_ATTR",
              "direction": "ASC",
              "nulls": "FIRST",
              "columnName": "SALES_AV_CUST_ID_HIER_CUST_ID_ATTR"
            }
          ],
          "name": "CUST_ID",
          "caption": "CUST_ID"
        }
      ]
    },
    ...
    ],
    "measures": [
        {
            "name": "QUANTITY_SOLD",
            "columnName": "QUANTITY_SOLD",
            "dataType": "NUMBER",
            "measureType": "BASE",
            "caption": "QUANTITY_SOLD"
        },
        {
            "name": "AMOUNT_SOLD",
            "columnName": "AMOUNT_SOLD",
            "dataType": "NUMBER",
            "measureType": "BASE",
            "caption": "AMOUNT_SOLD"
        }
    ],
    "av": "SALES_AV",
    "owner": "ADPTEST"
}

Get the Analytic View Data

ords.Analytics.getAnalyticViewData(True,
['SALES_AV_CUST_ID_HIER_CUST_ID_ATTR', 'QUANTITY_SOLD', 'SALES_AV_CUST_ID_HIER_DEPTH'],
'SALES_AV', ['SALES_AV_CUST_ID_HIER'], ['QUANTITY_SOLD'],
[ { 'hierarchy': 'SALES_AV_CUST_ID_HIER', 'column': 'LEVEL_NAME', 'operator': '=', 'value': 'ALL:CUST_ID' } ])
  
Output:
  
[
  {
    "SALES_AV_CUST_ID_HIER_CUST_ID_ATTR": 100,
    "SALES_AV_CUST_ID_HIER_DEPTH": 1,
    "QUANTITY_SOLD": 5
  },
  {
    "SALES_AV_CUST_ID_HIER_CUST_ID_ATTR": 200,
    "SALES_AV_CUST_ID_HIER_DEPTH": 1,
    "QUANTITY_SOLD": 4
  },
...
]

Insight examples

You can use the following examples to generate an insight, drop the insight, list the insights and their requests, check the completion of insights, and get data from the visualization view.

Generate an insight

ords.Insight.generateInsight('SALES_AV', 'AMOUNT_SOLD')
  
Output:
{
  "object_owner": "ADPTEST",
  "request_name": "REQUEST_INSIGHT_1",
  "max_insight_count": 20,
  "source_object": "SALES_AV",
  "source_owner": "SALES_AV",
  "request_metadata": "{\"targets\":[\"AMOUNT_SOLD\"],\"appName\":\"INSIGHTS\"}",
  "default_job_settings": "{\"insightTypes\":[\"FITTED_SHARE_COMPARISON\"]}",
  "new_request": "true"
}

Drop the insight

ords.Insight.dropInsight('REQUEST_INSIGHT_1')
Output:

List of insight requests

ords.Insight.insightRequestList()
  
Output:
[
  ...
  {
    "owner": "ADMIN",
    "request_name": "REQUEST_INSIGHT_1",
    "request_type": "TABLE_INSIGHT",
    "source_object": "SALES",
    "source_owner": "ADMIN",
    "request_metadata": "{\"targets\":[\"ACTUAL\"],\"appName\":\"INSIGHTS\"}",
    "created": "2022-10-31T08:03:26.634Z",
    "updated": "2022-10-31T08:03:26.634Z",
    "request_job_setting": "{\"EXTRACTION\":{\"selectionRule\":{\"value\":\"MAX_DIFFERENCE\"},\"fittedShareConvergence\":{\"value\":\"LAX\"},\"maxNumExtractionDims\":{\"value\":10000},\"insightValueType\":{\"value\":\"MEMBER\"},\"excludeZero\":{\"value\":\"YES\"},\"minXAxisCount\":{\"value\":2}},\"sessionId\":null,\"applicationName\":\"INSIGHTS\",\"insightTypes\":[\"FITTED_SHARE_COMPARISON\"]}",
    "max_num_insights": 20
  },
  ...
]

List of insights

ords.Insight.getInsightsList('REQUEST_INSIGHT_1')
  
Output:
[
  {
    "insight_name": "INSIGHT_62",
    "visualization_id": 199,
    "insight_column": "ACTUAL",
    "insight_value": "COUNTRY.2",
    "insight_dimension": "COUNTRY",
    "dimension": "FORECST"
  },
  ...
]

Check the completion of insight

ords.Insight.getInsightsList('REQUEST_INSIGHT_1')
  
Output:
[
  {
    "insight_name": "INSIGHT_62",
    "visualization_id": 199,
    "insight_column": "ACTUAL",
    "insight_value": "COUNTRY.2",
    "insight_dimension": "COUNTRY",
    "dimension": "FORECST"
  },
  ...
]

Get Data from the visualization view

ords.Insight.getGraphDetails("INSIGHT_62", 199, 5, True)
  
Output:
{
  "items": {
    "description": "COUNTRY.2",
    "XAXIS": "FORECST",
    "measure": "ACTUAL",
    "insight_type_label": "Expected",
    "visualization_condition": "COUNTRY.2",
    "source_object": "C0"
  },
  "query": {
    "labels": [
      "01:[66.7-1590]",
      "02:[829-2350]",
      "04:[2350-3870]",
      "06:[3870-5390]",
      "07:[4630-6150]"
    ],
    "actuals": [
      2200,
      6200,
      24600,
      26600,
      14200
    ],
    "estimates": [
      2671.105013588167,
      4830.820533674098,
      7135.506976989193,
      24772.963909661587,
      22121.48462234561
    ]
  }
}

Data Load Examples

This section provides the following examples to help you get started using the Data Load API:

  • Create database tables from local files
  • Create database tables from Database Link tables
  • Create database view from Database Link tables
  • Create Cloud Storage Link
  • Create database tables from cloud objects in Cloud Storage Link
  • Create database external tables from cloud objects in Cloud Storage Link

Create database tables from local files
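A minimal sketch of the call, assuming the loadFiles() function described in the Load Data section; the file path and table name are placeholders, and the ords instance comes from the login step:

```python
# Each entry describes one local file to load; targetTableName and
# options are optional (the file name becomes the table name if omitted).
files = [
    {"fileName": "/home/user/data/sales.csv",
     "targetTableName": "SALES",
     "options": None},
]

# With a connected instance, the load would be started with (sketch;
# the exact namespace of loadFiles may differ in your installation):
#   ords.Ingest.loadFiles(files)
print(files[0]["targetTableName"])
```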


Create database tables from Database Link tables

ords.Ingest.getDatabaseLinks()
  
output:
{
  "nodes": [
    {
      "label": "**.ORACLE.COM",
      "type": "DB_LINK",
      "id": "\"ADPTEST\".\"DB_LINK\".\"PHOENIX119757_ORDS_SH.REGRESS.RDBMS.DEV.US.ORACLE.COM\"",
      "data": {
        "name": "**US.ORACLE.COM",
        "namespace": "DB_LINK",
        "path": "\"DB_LINK\".\"**.US.ORACLE.COM\"",
        "schema": "ADPTEST",
        "application": "DATABASE",
        "created": "2022-08-14T07:58:57Z",
        "updated": "2022-08-14T07:58:57Z"
      }
    }
  ],
  "links": []
}

Get list of tables in the Database Link

ords.Ingest.getDBLinkOwnerTables('**US.ORACLE.COM')
  
output:
  
{
  "CAL_MONTH_SALES_MV": {
    "dbLink": "**.ORACLE.COM",
    "owner": "SH",
    "tableName": "CAL_MONTH_SALES_MV",
    "numRows": 35,
    "avgRowLen": 23
  },
  "CHANNELS": {
    "dbLink": "**US.ORACLE.COM",
    "owner": "SH",
    "tableName": "CHANNELS",
    "numRows": 5,
    "avgRowLen": 20
  },
  "COSTS": {
    "dbLink": "**US.ORACLE.COM",
    "owner": "SH",
    "tableName": "COSTS",
    "numRows": 960,
    "avgRowLen": 19
  },
  "COUNTRIES": {
    "dbLink": "**US.ORACLE.COM",
    "owner": "SH",
    "tableName": "COUNTRIES",
    "numRows": 19,
    "avgRowLen": 33
  },
  "CUSTOMERS": {
    "dbLink": "**US.ORACLE.COM",
    "owner": "SH",
    "tableName": "CUSTOMERS",
    "numRows": 630,
    "avgRowLen": 139
  },
  "DIMENSION_EXCEPTIONS": {
    "dbLink": "**US.ORACLE.COM",
    "owner": "SH",
    "tableName": "DIMENSION_EXCEPTIONS",
    "numRows": 0,
    "avgRowLen": 0
  },
  "FWEEK_PSCAT_SALES_MV": {
    "dbLink": "**US.ORACLE.COM",
    "owner": "SH",
    "tableName": "FWEEK_PSCAT_SALES_MV",
    "numRows": 934,
    "avgRowLen": 70
  },
  "MV_CAPABILITIES_TABLE": {
    "dbLink": "**US.ORACLE.COM",
    "owner": "SH",
    "tableName": "MV_CAPABILITIES_TABLE",
    "numRows": 0,
    "avgRowLen": 0
  },
  "PLAN_TABLE": {
    "dbLink": "**US.ORACLE.COM",
    "owner": "SH",
    "tableName": "PLAN_TABLE",
    "numRows": 0,
    "avgRowLen": 0
  },
  "PRODUCTS": {
    "dbLink": "**US.ORACLE.COM",
    "owner": "SH",
    "tableName": "PRODUCTS",
    "numRows": 766,
    "avgRowLen": 243
  },
  "PROMOTIONS": {
    "dbLink": "**US.ORACLE.COM",
    "owner": "SH",
    "tableName": "PROMOTIONS",
    "numRows": 501,
    "avgRowLen": 68
  },
  "REWRITE_TABLE": {
    "dbLink": "**US.ORACLE.COM",
    "owner": "SH",
    "tableName": "REWRITE_TABLE",
    "numRows": 0,
    "avgRowLen": 0
  },
  "SALES": {
    "dbLink": "**US.ORACLE.COM",
    "owner": "SH",
    "tableName": "SALES",
    "numRows": 960,
    "avgRowLen": 30
  },
  "TIMES": {
    "dbLink": "**US.ORACLE.COM",
    "owner": "SH",
    "tableName": "TIMES",
    "numRows": 1461,
    "avgRowLen": 178
  }
}
Create tables from the Database Link tables

ords.Ingest.copyDatabaseTablesFromDbLink([{'owner':'SH', 'tableName':'COUNTRIES', 'dbLink':'**US.ORACLE.COM'},
        {'owner':'SH', 'tableName':'CHANNELS', 'dbLink':'**US.ORACLE.COM'}])
  
output:
[
  {
     "owner":"SH",
     "tableName":"COUNTRIES",
     "dbLink":"**US.ORACLE.COM",
     "targetTableName":"COUNTRIES",
     "rowsCopied": 99
  },
  {
     "owner":"SH",
     "tableName":"CHANNELS",
     "dbLink":"**US.ORACLE.COM",
     "targetTableName":"CHANNELS",
     "rowsCopied": 9
  }
]

Create database view from Database Link tables

ords.Ingest.linkDatabaseTablesFromDbLink([{'owner':'SH', 'tableName':'COUNTRIES', 'dbLink':'**US.ORACLE.COM'},
        {'owner':'SH', 'tableName':'CHANNELS', 'dbLink':'PHOENIX119757_ORDS_SH.REGRESS.RDBMS.DEV.US.ORACLE.COM'}])
  
output:
['COUNTRIES', 'CHANNELS']

Create Cloud Storage Link

ords.Ingest.createCloudStorageLink('ILIA', 'https://**.oraclecloud.com/n/idpbzw1f44my/b/test-bucket/o/', '**', 'oracleidentitycloudservice/xxxx.xxxxxx@oracle.com', '_Ai90CAqjzCs;XM2wgZ(')

Create database tables from cloud objects in Cloud Storage Link

ords.Ingest.getCloudBuckets()
  
output:
{
  "nodes": [
    {
      "label": "ILIA",
      "type": "CLOUD_STORAGE_LINK",
      "id": "\"ADPTEST\".\"STORAGE_LINK\".\"ILIA\"",
      "data": {
        "name": "ILIA",
        "entityID": 196,
        "namespace": "STORAGE_LINK",
        "path": "\"STORAGE_LINK\".\"ILIA\"",
        "schema": "ADPTEST",
        "application": "CLOUD",
        "created": "2022-08-15T06:57:01Z",
        "updated": "2022-08-15T06:57:01Z"
      }
    }
  ],
  "links": []
}
 
Create database tables from cloud objects in Cloud Storage Link.

ords.Ingest.copyCloudObjects([{'storageLink':'ILIA', 'objectName':'users/weblogic/testData.csv', 'targetTableName':'TESTDATA'},
        {'storageLink':'ILIA', 'objectName':'users/weblogic/data.json'}])
  
output:
[
  {
    ...
    "tableOwner": "ADPTEST",
    "debugLevel": null,
    "tableName": "TESTDATA",
    "dropTableNames": [],
    "requestId": 214,
    "directiveId": 1,
    "credentialName": "ILIA",
    ...
  },
  {
    ...
    "tableOwner": "ADPTEST",
    "tableName": "DATA",
    "dropTableNames": [],
    "requestId": 214,
    ...
  }
]
Check the status of the Ingest job.

ords.Ingest.getJobStatus('214')
  
output:
  
[
  {
    "schema": "ADPTEST",
    "request_id": 214,
    "table_name": "TESTDATA",
    "time_start": "2022-08-15T13:42:38.072476000+00:00",
    "last_event": "DIRECTIVE_END",
    "status": "complete",
    "time_completion": "2022-08-15T13:42:41.988662000+00:00",
    "rows_rejected": 9,
    "rows_total": 588,
    "rows_loaded": 579,
    "loader_rejected": null
  },
  {
    "schema": "ADPTEST",
    "request_id": 214,
    "table_name": "DATA",
    "time_start": "2022-08-15T13:42:38.223560000+00:00",
    "last_event": "DIRECTIVE_END",
    "status": "complete",
    "time_completion": "2022-08-15T13:42:45.053346000+00:00",
    "rows_rejected": 0,
    "rows_total": 4,
    "rows_loaded": 4,
    "loader_rejected": null
  }
]
Get the list of objects in the Cloud Storage Link

ords.Ingest.getCloudObjects('ILIA')
  
output:
{
  "nodes": [
     {
      "label": "users/weblogic/data.json",
      "type": "CLOUD_OBJECT",
      "id": "\"ADPTEST\".\"STORAGE_LINK\".\"ILIA\".\"OBJECT\".\"users/weblogic/data.json\"",
      "data": {
        "name": "users/weblogic/data.json",
        "namespace": "OBJECT",
        "path": "\"STORAGE_LINK\".\"ILIA\".\"OBJECT\".\"users/weblogic/data.json\"",
        "schema": "ADPTEST",
        "annotation": {
          "bytes": 772,
          "checksum": "d3fae7a10773bd74041847d52a89e088",
          "isFolder": false,
          "fileName": "data.json",
          "uri": "https://**.oraclecloud.com/n/idpbzw1f44my/b/test-bucket/o/users/weblogic/data.json"
        },
        "application": "CLOUD",
        "created": "2022-08-02T11:26:38Z",
        "updated": "2022-08-02T11:26:38Z"
      }
     },
     {
      "label": "users/weblogic/testData.csv",
      "type": "CLOUD_OBJECT",
      "id": "\"ADPTEST\".\"STORAGE_LINK\".\"ILIA\".\"OBJECT\".\"users/weblogic/testData.csv\"",
      "data": {
        "name": "users/weblogic/testData.csv",
        "namespace": "OBJECT",
        "path": "\"STORAGE_LINK\".\"ILIA\".\"OBJECT\".\"users/weblogic/testData.csv\"",
        "schema": "ADPTEST",
        "annotation": {
          "bytes": 50920,
          "checksum": "0b6a9a339d17f7509ba24c904fbb3fb6-1",
          "isFolder": false,
          "fileName": "testData.csv",
          "uri": "https://**.oraclecloud.com/n/idpbzw1f44my/b/test-bucket/o/users/weblogic/testData.csv"
        },
        "application": "CLOUD",
        "created": "2022-04-07T22:35:24Z",
        "updated": "2022-04-07T22:35:24Z"
      }
     },
     {
      "label": "users/weblogic/testData.txt",
      "type": "CLOUD_OBJECT",
      "id": "\"ADPTEST\".\"STORAGE_LINK\".\"ILIA\".\"OBJECT\".\"users/weblogic/testData.txt\"",
      "data": {
        "name": "users/weblogic/testData.txt",
        "namespace": "OBJECT",
        "path": "\"STORAGE_LINK\".\"ILIA\".\"OBJECT\".\"users/weblogic/testData.txt\"",
        "schema": "ADPTEST",
        "annotation": {
          "bytes": 58577,
          "checksum": "7f8c40956fe0a025f2e257771c6d52e5-1",
          "isFolder": false,
          "fileName": "testData.txt",
          "uri": "https://**.com/n/idpbzw1f44my/b/test-bucket/o/users/weblogic/testData.txt"
        },
        "application": "CLOUD",
        "created": "2022-04-07T22:35:38Z",
        "updated": "2022-04-07T22:35:38Z"
      }
     },
    ...
    ],
  "links": []
}

Create database tables from cloud objects in Cloud Storage Link

ords.Ingest.copyCloudObjects([{'storageLink':'ILIA', 'objectName':'users/weblogic/testData.csv', 'targetTableName':'TESTDATA'},
        {'storageLink':'ILIA', 'objectName':'users/weblogic/data.json'}])
  
output:
[
   ...
  {
    "tableOwner": "ADPTEST",
    "debugLevel": null,
    "tableName": "TESTDATA",
    "dropTableNames": [],
    "requestId": 214,
    "directiveId": 1,
    "credentialName": "ILIA",
    ...
  },
  {
    ...
    "tableOwner": "ADPTEST",
    "tableName": "DATA",
    "dropTableNames": [],
    "requestId": 214,
    ...
  }
]
Check the status of the ingest job

ords.Ingest.getJobStatus('214')
  
output:
  
[
  {
    "schema": "ADPTEST",
    "request_id": 214,
    "table_name": "DATA",
    "time_start": "2022-08-15T13:42:38.223560000+00:00",
    "last_event": "DIRECTIVE_END",
    "status": "complete",
    "time_completion": "2022-08-15T13:42:45.053346000+00:00",
    "rows_rejected": 0,
    "rows_total": 4,
    "rows_loaded": 4,
    "loader_rejected": null
  }
  ...
]

Create database external tables from cloud objects in Cloud Storage Link

Call the linkCloudObjects function.

ords.Ingest.linkCloudObjects([{'storageLink':'ILIA', 'objectName':'users/weblogic/testData.csv', 'targetTableName':'TESTDATA'},
        {'storageLink':'ILIA', 'objectName':'users/weblogic/data.json'}])
  
output:
[
  {
    ...
    "tableOwner": "ADPTEST",
    "debugLevel": null,
    "tableName": "TESTDATA",
    "dropTableNames": [],
    "requestId": 215,
    "directiveId": 1,
    "credentialName": "ILIA",
    ...
  },
  {
    ...
    "tableOwner": "ADPTEST",
    "tableName": "DATA",
    "dropTableNames": [],
    "requestId": 215,
    ...
  }
]

Check the status of the Ingest job.

ords.Ingest.getJobStatus('215')
  
output:
[
  {
    "schema": "ADPTEST",
    "request_id": 215,
    "table_name": "TESTDATA",
    "time_start": "2022-08-15T13:42:38.072476000+00:00",
    "last_event": "DIRECTIVE_END",
    "status": "complete",
    "time_completion": "2022-08-15T13:42:41.988662000+00:00",
    "rows_rejected": 9,
    "rows_total": 0,
    "rows_loaded": 0,
    "loader_rejected": null
  },
  {
    "schema": "ADPTEST",
    "request_id": 215,
    "table_name": "DATA",
    "time_start": "2022-08-15T13:42:38.223560000+00:00",
    "last_event": "DIRECTIVE_END",
    "status": "complete",
    "time_completion": "2022-08-15T13:42:45.053346000+00:00",
    "rows_rejected": 0,
    "rows_total": 0,
    "rows_loaded": 0,
    "loader_rejected": null
  }
]