5 Manage Database Connections for Modeling Data

Administrators create and manage cloud database connections for modeling relational and non-relational data such as Essbase, Snowflake, or Oracle Enterprise Performance Management (Oracle EPM) data. Your business data doesn't have to be in one place. Connect to multiple cloud databases so that business modelers and analysts can analyze company data wherever it's stored.

Model Data in an Essbase Cube

Connect to an Essbase database so that you can model and visualize data from Essbase cubes.

Note: You can only model Essbase data in Model Administration Tool.
Before you start, deploy Data Gateway with your Oracle Analytics instance and configure a Data Gateway agent to communicate with the remote Essbase deployment.
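Optionally, before you configure the agent, you can check that the Essbase agent endpoint (of the form http://<IP address>:<port number>/essbase/agent, as used in the connection pool below) is reachable from the machine that hosts Data Gateway. The following is a minimal sketch using standard Java (11 or later); it isn't part of the Oracle Analytics tooling, and the host and port are placeholders.

    // Quick reachability check for an Essbase agent endpoint (hypothetical host and port).
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class EssbaseAgentCheck {
        public static void main(String[] args) throws Exception {
            // Replace with the IP address and port of your Essbase deployment.
            String agentUrl = "http://192.0.2.10:9000/essbase/agent";

            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(URI.create(agentUrl)).GET().build();

            // Any HTTP status code proves the agent is reachable; a refused connection
            // or timeout points to a network or firewall issue between the hosts.
            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println("Essbase agent responded with HTTP " + response.statusCode());
        }
    }
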
  1. In the Physical Layer, create a new database:
    1. Right-click in the Physical Layer pane and select Create New Database.
    2. At the Database dialog, specify a name to identify your database within Oracle Analytics.
    3. Select the Database Type (for example, Essbase 11), then click OK.
  2. In the Physical Layer, create a new connection pool:
    1. Right-click the new database, click Create New Object, then select Database Connection Pool.
    2. At the Connection Pool dialog, specify a name to identify the database pool within Oracle Analytics.
    3. In Essbase Server, specify the connection string for your Essbase server.
      For example, http://<IP address>:<port number>/essbase/agent.
    4. Select Use Data Gateway.
    5. Enter the User name and Password for the Essbase deployment.
    6. If prompted, re-enter the password for the Essbase deployment.
  3. In the Physical Layer, import the Essbase metadata:
    1. Right-click the Essbase connection and select Import Metadata.
    2. At the Select Data Source page, click Next.
    3. At the Select Metadata Objects page, expand the database in the Data source box, select the cube you want, then click Import selected.

      For a large cube, the import can take two to three minutes.

    4. When the import is complete, expand the database in the Repository View to display the imported Essbase cube.
    5. Click Finish.
  4. Using the physical layer that you've just created, create your business model and mapping layer, and your presentation layer.
  5. Click File, Cloud, then Publish.
  6. Create a dashboard or visualization workbook based on your Essbase cube.
    The new Essbase cube is now available as a subject area in Oracle Analytics.
    For example, when you create an analysis, the new Essbase subject area is available in the Select Subject Area dialog. When you create a workbook, click Subject Areas in the Add Dataset dialog to access the new Essbase subject area.

Model Data in Snowflake Data Warehouse

Configure your on-premises environment so that you can model data in a Snowflake database.

Note: You can only model Snowflake data in Semantic Modeler or Model Administration Tool. This task describes the process using Model Administration Tool.
Both local and remote connections to Snowflake from a semantic model require an installation of Data Gateway so that Model Administration Tool can import and model tables from the Snowflake data source. The Data Gateway agent must be configured with a suitable driver to connect to Snowflake. For remote connections, Data Gateway must remain available when queries are run. For local connections, once the Snowflake tables are modeled and the semantic model is published to Oracle Analytics, you can disable or remove Data Gateway because it isn't used when queries are run from Oracle Analytics.

You can create either a local or a remote connection to Snowflake from the semantic model. When creating a local (not remote) connection, the connection pool in the semantic model uses a JDBC connection.

Before you start, install Data Gateway and Model Administration Tool on the same Windows computer in your on-premises environment.
  1. Configure a local Data Gateway agent to facilitate a connection from the Developer Client Tool to Snowflake.
    1. Download the latest Snowflake JDBC driver (for example, in file snowflake-jdbc-3.9.0.jar).
    2. Copy the downloaded JAR file to the Data Gateway installation folder.
      In a server deployment, copy the JAR file into:
      <Data Gateway folder>/domain/jettybase/lib/ext
      In a personal deployment, copy the JAR file into:
      <install directory>\war\datagateway\WEB-INF\lib
    3. Re-start Data Gateway.
  2. Configure a connection to your Snowflake database.
You can now model your data using this connection.
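Optionally, you can confirm that the downloaded JDBC driver and your Snowflake credentials work before you build the connection pool. The following is a minimal sketch in Java, not part of the Oracle Analytics tooling; the account, database, warehouse, schema, and credentials are placeholders, and the connection string uses the same format as the Data source name field described in the topics that follow.

    // Minimal connectivity check for the Snowflake JDBC driver (placeholder values).
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.Properties;

    public class SnowflakeDriverCheck {
        public static void main(String[] args) throws Exception {
            // Same driver class that the connection pool uses.
            Class.forName("net.snowflake.client.jdbc.SnowflakeDriver");

            // Same format as the Data source name connection string.
            String url = "jdbc:snowflake://xxxx.snowflakecomputing.com?db=ODEV&warehouse=xxxxxx&schema=xxxxxx";

            Properties props = new Properties();
            props.put("user", "MY_SNOWFLAKE_USER");         // placeholder credentials
            props.put("password", "MY_SNOWFLAKE_PASSWORD");

            try (Connection conn = DriverManager.getConnection(url, props);
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT CURRENT_VERSION()")) {
                if (rs.next()) {
                    System.out.println("Connected. Snowflake version: " + rs.getString(1));
                }
            }
        }
    }

Compile and run the class with the downloaded snowflake-jdbc JAR on the classpath; if it prints a version, the same connection string and credentials should work in the connection pool.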

Create a Local Semantic Model Connection to Snowflake

Connect to a local Snowflake database so that you can model Snowflake data.

  1. In Model Administration Tool, enable the JDBC connection pool functionality by loading Java data sources. See Step 3 in Configure and Register Data Gateway for Reporting.
  2. In Model Administration Tool, create a new database and set the type to Snowflake.
  3. Add a connection pool and specify these details on the General tab:
    • Call Interface: JDBC(Direct Driver).
    • Require fully qualified table names: Yes.
    • Data source name: Enter the connection string, for example: jdbc:snowflake://xxxx.snowflakecomputing.com?db=ODEV&warehouse=xxxxxx&schema=xxxxxx
    • RDC Version: Leave this field blank.
  4. On the Miscellaneous tab, specify these details:
    • JDS Server URL: Leave this field blank (remove any entry in this field).
    • Driver Class: net.snowflake.client.jdbc.SnowflakeDriver.
    • Use SQL over HTTP: false.
  5. Model your data using this connection.
  6. Upload or publish your semantic model to Oracle Analytics Cloud when you have completed your model.
    Note: Oracle Analytics Cloud connects to the Snowflake database without using Data Gateway.

Create a Remote Semantic Model Connection to Snowflake

Create a remote connection to a Snowflake database so that you can model Snowflake data.

  1. In Model Administration Tool, enable the JDBC connection pool functionality by loading Java data sources. See Step 3 in Configure and Register Data Gateway for Reporting.
  2. In Model Administration Tool, create a new database and set the type to Snowflake.
  3. Add a connection pool and specify these details on the General tab:
    • Call Interface: JDBC(Direct Driver).
    • Require fully qualified table names: Yes.
    • Data source name: Enter the connection string, for example: jdbc:snowflake://xxxx.snowflakecomputing.com?db=ODEV&warehouse=xxxxxx&schema=xxxxxx
    • RDC Version: Set to 2.
  4. On the Miscellaneous tab, specify these details:
    • JDS Server URL: Leave this field blank (remove any entry in this field).
    • Driver Class: net.snowflake.client.jdbc.SnowflakeDriver.
    • Use SQL over HTTP: true.
  5. Model your data using this connection.
  6. Upload or publish the semantic model to Oracle Analytics Cloud when you have completed your model.
    Note: Oracle Analytics Cloud connects to Snowflake using any configured Data Gateway agent.
  7. Copy the Snowflake driver file to each Data Gateway agent installation folder.
    • In a server deployment, copy the JAR file into: <Data Gateway install_location>/domain/jettybase/thirdpartyDrivers.
    • In a personal deployment on Windows, copy the JAR file into: <Data Gateway_extract_path>\thirdpartyDrivers.
    • In a personal deployment on macOS, right-click the Data Gateway application and select Show Package Contents, then copy the JAR file into Resources/app.nw/thirdpartyDrivers.
  8. Re-start Data Gateway. See Maintain Data Gateway.

Model Data in Google BigQuery

Connect to a Google BigQuery database so that you can model and visualize data from Google BigQuery.

Note: You can only model Google BigQuery data in Semantic Modeler or Model Administration Tool. These tasks describe the process using Model Administration Tool.

Create an Oracle Analytics Connection to Google BigQuery

You can create a system connection to a Google BigQuery database and use the connection to model the Google BigQuery project.

Before you start, download the Service Account Private Key (in JSON format) for your Google BigQuery service.
  1. On the Oracle Analytics Home page, click Create, and then click Connection.
  2. Click BigQuery.
  3. Enter the connection details.
    • In Connection Name, specify a user-friendly name to identify the connection details in Oracle Analytics.
    • In Project, specify the name, in lowercase, of the BigQuery project that you want to analyze.
    • In Service Account Private Key, click Select and upload the Service Account Private Key (JSON format) for your BigQuery service. The Service Account Email is populated from the uploaded key details.
    • Select the System connection option.

  4. Save the details.
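Optionally, you can confirm that the key file parses correctly before you upload it. The following is a minimal sketch using the Google auth library for Java (google-auth-library-oauth2-http); it isn't part of Oracle Analytics, and the key file path is a placeholder.

    // Parse a BigQuery Service Account Private Key (JSON) and print its identifying fields.
    import com.google.auth.oauth2.ServiceAccountCredentials;
    import java.io.FileInputStream;

    public class BigQueryKeyCheck {
        public static void main(String[] args) throws Exception {
            try (FileInputStream keyStream = new FileInputStream("/path/to/service-account-key.json")) {
                ServiceAccountCredentials creds = ServiceAccountCredentials.fromStream(keyStream);
                // The client email is what Oracle Analytics populates as Service Account Email;
                // the project ID should match the lowercase Project value in the connection.
                System.out.println("Service account email: " + creds.getClientEmail());
                System.out.println("Project ID: " + creds.getProjectId());
            }
        }
    }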

Download and Set Up BigQuery ODBC Driver

Install the ODBC driver required for connection to Google BigQuery, and configure it in Model Administration Tool to enable you to model the project.

  1. Download the Simba BigQuery ODBC driver from Google.
    For example, download it from the Google reference site.
  2. Install the downloaded driver on the machine where Oracle Analytics Client Tools is installed.
  3. Configure the ODBC driver using the DSN Setup dialog.

  4. Click Test to test the connection.

  5. Save the details.

Build a Data Model from Google BigQuery Data Source

You build a data model for your Google BigQuery database so that you can deploy it to visualize data in a BigQuery project.

Before you start, note the following:
To build a data model, you need the permissions granted by the BigQuery key. If the key grants access at the dataset level, simply perform Import Metadata using the BigQuery ODBC driver by following the steps below. If the key grants access only to specific tables or views, follow the steps below to create a physical schema.
  1. In Model Administration Tool, create a database in the repository and set the Database type to ODBC Basic.

  2. In the Connection Pools dialog, create a connection pool in the database.
    • In Call interface, select "Default (ODBC 2.0)".
    • In the Data source name field, select the BigQuery ODBC data source (DSN) that you configured earlier.

  3. Create a physical schema in the database using the same name as the BigQuery dataset.
    BigQuery SQL requires table names to be qualified with the dataset name, in the form dataset.table. The dataset name is equivalent to a physical schema object in the repository file.
  4. Right-click the connection pool and select Import Metadata.
  5. On the Select Data Source dialog, select either ODBC 2.0 or ODBC 3.5 for the connection type, and select the BigQuery ODBC driver.

  6. On the Select Metadata Types dialog, select Views and any other types you want to use for which your BigQuery key has permissions.

  7. On the Select Metadata Objects dialog, select the individual tables and then click Import Selected. This imports the BigQuery database and the underlying structures. Tip: If you click Import All, you import only the database. If this happens, select Import All a second time to import the tables.

  8. Click Finish.
  9. Drag the imported tables into the physical schema.
  10. Edit the physical database and change the database type to BigQuery.

    When changing the physical database, you see a message that states that the database type doesn't match the call interface set in the connection pool. Click Yes.

  11. In the Connection Pool dialog, configure these settings:
    • In Call interface, change the call interface to JDBC (Direct Driver).
    • Select Require fully qualified table names.
    • Select Use Data Connection.
    • In Oracle Analytics, inspect the BigQuery connection and copy the Object ID. BigQuery is case-sensitive. To ensure that the data connection syntax is correct, use the Copy button.

    • In the Connection Pool dialog, paste the copied Object ID into the Object ID field.
    • Set Maximum connections to 100.
  12. Save the details.
Model the metadata in the repository and upload the repository file (RPD) to Oracle Analytics.

Troubleshoot Repository Connection Issues for Google BigQuery

Here are some issues that you might encounter when connecting to Google BigQuery, and workarounds for them.

If 'Require fully qualified table names' isn't selected and a physical schema isn't part of the generated SQL, then queries fail with a message similar to 'Failed to read data from Java Datasource server'.

If the query is run against BigQuery using nqcmd or another SQL entry tool, the actual error message is displayed:

WITH SAWITH0 AS (select distinct T4.PROP_CD as c1 from FINOPS_RM_OCC_ACT T4) select 0 as c1, D1.c1 as c2 from SAWITH0 D1 order by c2
[Simba][BigQuery] (70) Invalid query: Table "FINOPS_RM_OCC_ACT" must be qualified with a dataset (e.g. dataset.table).
Statement preparation failed

The way to qualify the query with a dataset is to use a physical schema in the repository file.
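For example, if the physical schema (and therefore the BigQuery dataset) is named FINDATA, a hypothetical name, the inner query is generated in the qualified form and succeeds:

select distinct T4.PROP_CD as c1 from FINDATA.FINOPS_RM_OCC_ACT T4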

If the Oracle Analytics connection uses an uppercase project name, the connection is created successfully, but you might see two different problems.

1. Queries fail with a 404 Not Found message about a masked URL:

[2022-03-17T01:13:44.105+00:00] [OBIS] [TRACE:2] [USER-34] [] [ecid: d6382db0-1e63-427e-893b-18bc00c0424e-0000de96,0:2:1:5] [sik: bootstrap] [tid: 856a6700] [messageId: USER-34] [requestid: 6358001e] [sessionid: 63580000] [username: Testuser] -------------------- Query Status: [nQSError: 46164] HTTP Server returned 404 (Not Found) for URL [masked_url]. [[
[nQSError: 46281] Failed to download metadata for dataset ‘system’.‘BigQuery Test’.
[nQSError: 43119] Query Failed:

2. Within Oracle Analytics, you see datasets but the underlying tables aren't available.

In both cases, you can modify the connection so that the project name is lowercase.

When troubleshooting BigQuery connections in Oracle Analytics, use a third-party JDBC client to try to connect to BigQuery using the same Service Account Key.

If the connection fails, there's a problem with the Service Account Key. If the connection succeeds, there's a problem with Oracle Analytics and you should contact Oracle Support.

This test is helpful in cases where a semantic model (.rpd) isn't used to connect to BigQuery and the Service Account Key isn't verified through ODBC.
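As an alternative to a JDBC client, you can exercise the same Service Account Key with the Google Cloud BigQuery client library for Java (google-cloud-bigquery). This isn't part of Oracle Analytics; the key file path and project name below are placeholders.

    // Verify that a Service Account Key can list datasets in a BigQuery project.
    import com.google.auth.oauth2.ServiceAccountCredentials;
    import com.google.cloud.bigquery.BigQuery;
    import com.google.cloud.bigquery.BigQueryOptions;
    import com.google.cloud.bigquery.Dataset;
    import java.io.FileInputStream;

    public class BigQueryAccessCheck {
        public static void main(String[] args) throws Exception {
            ServiceAccountCredentials creds;
            try (FileInputStream keyStream = new FileInputStream("/path/to/service-account-key.json")) {
                creds = ServiceAccountCredentials.fromStream(keyStream);
            }
            BigQuery bigquery = BigQueryOptions.newBuilder()
                    .setProjectId("my-lowercase-project")   // same lowercase project name as the connection
                    .setCredentials(creds)
                    .build()
                    .getService();
            // If this lists datasets, the key and project are fine and the problem
            // lies on the Oracle Analytics side.
            for (Dataset dataset : bigquery.listDatasets().iterateAll()) {
                System.out.println(dataset.getDatasetId().getDataset());
            }
        }
    }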

DSN Formats for Specifying Data Sources

In Oracle Analytics, you can model on-premises data from many database types. Oracle Analytics supports direct access to some on-premises data sources through the semantic model. When you create the database connection in Model Administration Tool, use the appropriate DSN format for the database type you're connecting to in the Data source name field on the Connection Pool dialog (General tab).

Amazon Redshift:
      DRIVER=Oracle 7.1 Amazon Redshift Wire Protocol;HOST=["host-name"];PORT=["port"];DB=["service-name"]
      SSL: DRIVER=Oracle 7.1 Amazon Redshift Wire Protocol;HOST=["host-name"];PORT=["port"];DB=["service-name"];EM=6;CPV=TLSv1.2,TLSv1.1,TLSv1,SSLv3,SSLv2;VSC=0
Apache Drill:
      DRIVER=MapR Drill ODBC Driver;Host=["host-name"];Port=["port"];CastAnyToVarchar=true;ExcludedSchemas=sys,INFORMATION_SCHEMA;AuthenticationType=Basic Authentication;ConnectionType=Direct
Aster:
      DRIVER=Aster ODBC Driver;SERVER=["host-name"];PORT=["port"];DATABASE=["service-name"]
DB2:
      DRIVER=Oracle 7.1 DB2 Wire Protocol;IpAddress=["host-name"];PORT=["port"];DB=["service-name"]
      SSL: DRIVER=Oracle 7.1 DB2 Wire Protocol;IpAddress=["host-name"];PORT=["port"];DB=["service-name"];EM=1;VSC=0
Greenplum:
      DRIVER=Oracle 7.1 Greenplum Wire Protocol;HOST=["host-name"];PORT=["port"];DB=["service-name"]
Hive:
      DRIVER=Oracle 8.0 Apache Hive Wire Protocol;HOST=["host-name"];PORT=["port"]
      SSL: DRIVER=Oracle 8.0 Apache Hive Wire Protocol;HOST=["host-name"];PORT=["port"];EM=1;VSC=0
Impala:
      DRIVER=Oracle 7.1 Impala Wire Protocol;HOST=["host-name"];PORT=["port"]
      SSL: DRIVER=Oracle 7.1 Impala Wire Protocol;HOST=["host-name"];PORT=["port"];EM=1;VSC=0
Informix:
      DRIVER=Oracle 7.1 Informix Wire Protocol;HOSTNAME=["host-name"];PORTNUMBER=["port"];DATABASE=["service-name"]
MongoDB:
      DRIVER=Oracle 8.0 MongoDB;HOST=["host-name"];PORT=["port"];DB=["service-name"]
MySQL:
      DRIVER=Oracle 7.1 MySQL Wire Protocol;HOST=["host-name"];PORT=["port"];DB=["service-name"]
PostgreSQL:
      DRIVER=Oracle 7.1 PostgreSQL Wire Protocol;HOST=["host-name"];PORT=["port"];DB=["service-name"]
Spark:
      DRIVER=Oracle 8.0 Apache Spark SQL;HOST=["host-name"];PORT=["port"]
      SSL: DRIVER=Oracle 8.0 Apache Spark SQL;HOST=["host-name"];PORT=["port"];EM=1;VSC=0
SQL Server:
      DRIVER=Oracle 7.1 SQL Server Wire Protocol;HOST=["host-name"];PORT=["port"];DB=["service-name"]
      SSL: DRIVER=Oracle 7.1 SQL Server Wire Protocol;HOST=["host-name"];PORT=["port"];DB=["service-name"];EM=1;VSC=0;CryptoProtocolVersion=TLSv1.2,TLSv1.1,TLSv1,SSLv3,SSLv2
Sybase:
      DRIVER=Oracle 7.1 Sybase Wire Protocol;NA=["host-name"], ["port"];DB=["service-name"]
Teradata:
      DRIVER=Oracle 7.1 Teradata;DBCName=["host-name"];port_name=["port"]
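For example, a completed SQL Server DSN with hypothetical host, port, and database values looks like this:

      DRIVER=Oracle 7.1 SQL Server Wire Protocol;HOST=sqlhost.example.com;PORT=1433;DB=SalesDB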