A Scripts for Administration Tasks
Oracle Analytics Cloud - Classic provides scripts to perform some common administration tasks. Always use the scripts provided. Don’t perform any administration tasks manually.
To run the scripts, see Run Administration Scripts.
Understanding Customization and Administration
- Supported customizations: Any configuration or functionality change that you make exclusively using Oracle Analytics Cloud - Classic scripts available from /bi/app/public/bin. The changes you make with these scripts are tested and supported.
  Oracle Analytics Cloud - Classic is based on an Oracle Linux image. If required, you can update the packages that are already installed using the yum update command. If you want to make any other custom changes to the image, you must log a service request with Oracle Support to check supportability.
- Unsupported customizations: Any configuration or functionality change that you make to Oracle Analytics Cloud - Classic using any other script, installer, or API, using WebLogic Console or Enterprise Manager, or by directly editing files. Such customizations are equivalent to on-premises customizations, for example, customizations similar to those made to Oracle BI Enterprise Edition.
  - Custom changes aren’t covered by Oracle Support.
  - Unsupported customizations can prevent patching and other service lifecycle operations.
Administration Scripts for Data Visualization and Business Intelligence Services (BI Services)
Administrative Task | Script Name | More Information
---|---|---
Change the WebLogic administrator password for my BI service | update_wls_admin_password | Change the WebLogic Administrator Password (BI Service Script)
Import a batch of users or roles from a CSV file | import_users_groups_csv | Import Users and Roles from a CSV File (BI Service Script)
Edit or delete a batch of users or roles from a CSV file | update_users_groups | Update or Delete Users and Roles from Embedded LDAP (BI Service Script)
Configure a public storage container for sharing content | configure_public_storage | Create a Public Container for Sharing Content (BI Service Script)
Update the password your BI service uses to access its database schemas | reset_schema_password | Update Database Credentials (BI Service Script)
Export and import datasets | migrate_datafiles | Export and Import Datasets (BI Service Script)
Gather diagnostic logs related to my BI service into a ZIP file before contacting Oracle Support | collect_diagnostic_logs | Gather Diagnostic Logs into a ZIP File (BI Service Script)
Find out the current status of my BI service | status | Get Status Information (BI Service Script)
Stop BI component processes running on my service | stop_analytics_suite | Stop and Start Component Processes (BI Service Script)
Start up BI component processes on my service | start_analytics_suite | Stop and Start Component Processes (BI Service Script)
Register an SSL private key with my HTTP proxy | proxy_register_ssl_private_key | Register SSL Private Keys with the HTTP Proxy for a Nonmetered Service (BI Service Script)
Redirect all HTTP calls to HTTPS | proxy_redirect_http_to_https | Redirect HTTP Calls to HTTPS (BI Service Script)
Enable a nonmetered service to store user group membership information in a database | configure_bi_sql_group_provider | Enable Database Storage for User Group Memberships for a Nonmetered Service (BI Service Script)
Administration Scripts for Essbase Services
Administrative Task | Script Name | More Information
---|---|---
Export Essbase applications and users from v17.3.x to the latest update | essbase_export.sh | Migrate Essbase Applications and Users
Import Essbase applications and users from v17.3.x to the latest update | essbase_import.sh | Migrate Essbase Applications and Users
Gather diagnostic logs related to my Essbase service into a ZIP file before contacting Oracle Support | collect_diagnostic_logs.py | Gather Diagnostic Logs into a ZIP File (Essbase Service Script)
Update the password your Essbase service uses to access its database schemas | changeRCUPassword.py | Update Database Credentials (Essbase Service Script)
Run Administration Scripts
You must be an Oracle Analytics Cloud - Classic administrator to run the administration scripts. To access a compute node associated with Oracle Analytics Cloud - Classic, you use Secure Shell (SSH) client software to establish a secure connection and log in as the user oracle. Several SSH clients are freely available. This topic describes how to run scripts using the ssh utility, included with UNIX and UNIX-like platforms.
Before you start, you’ll need some connection information:
- The IP address of the compute node. The IP address of a compute node associated with your Oracle Analytics Cloud - Classic service is displayed in Oracle Cloud Infrastructure Console (Overview page for the service). See View and Manage Services.
- The SSH private key file that matches the public key associated with the service.
Connect to the compute node with the ssh utility, then run the required script. If a value that you pass to a script contains shell special characters, enclose it in single quotes. For example, for the password MYNEW$PASSWORD$, you enter 'MYNEW$PASSWORD$'.
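A minimal sketch of a session follows; the key file path and IP address are placeholders, and the final lines mirror the single-quoting example above:

```shell
# Connection values below are placeholders; substitute your own private key
# file and compute node IP address:
#   ssh -i /path/to/privateKey oracle@192.0.2.10
#
# When passing a value with shell metacharacters to a script, single quotes
# stop the shell from expanding $PASSWORD$ as a variable reference:
pw='MYNEW$PASSWORD$'
echo "$pw"
```

Without the single quotes, the shell would try to substitute $PASSWORD$ and the script would receive a different value.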
Change the WebLogic Administrator Password (BI Service Script)
You set the administrator password for the WebLogic server when you set up your service. If you want to change the password, you must always use the script update_wls_admin_password.
Note:
If you change the administrator password, your backups might not contain the latest password. If this is the case and you restore your service, the administrator password reverts to the older password in the backup.
Script Location
/bi/app/public/bin/update_wls_admin_password
To run the script, see Run Administration Scripts.
Syntax
update_wls_admin_password [-h] [LOGLEVEL] [LOGDIR] 'username' 'old_password' 'new_password'
Where:
- username: Existing WebLogic server administrator user name.
- old_password: Existing WebLogic server administrator password.
- new_password: New password for the WebLogic server administrator user.
- -h: Shows help for the script and exits.
- LOGLEVEL: Sets the logging level for standard errors (stderr). The default is INFO. Options: DEBUG, INFO, WARNING, ERROR, CRITICAL. The logging level for messages in the log file is always DEBUG.
- LOGDIR: Log directory. The default directory is /var/log/bi.
Example
update_wls_admin_password 'weblogic' 'oldpassword' 'newpassword'
Export Users and Roles to CSV Files (BI Service Script)
(Only valid for services using the WebLogic embedded LDAP server.) To export users and roles, use the script wls_ldap_csv_exporter. This script creates two CSV files and a log file: one CSV file contains users and the other contains groups.
Script Location
/bi/app/public/bin/wls_ldap_csv_exporter
To run the script, see Run Administration Scripts.
Syntax
wls_ldap_csv_exporter -u weblogic_admin_user -c oracle_common_folder_path -D output_dir [--loglevel LOGLEVEL] [--logdir LOGDIR]
Where:
- -u: Sets the administrator user name.
- -c: Sets the oracle_common folder path. Typically /bi/app/fmw/oracle_common.
- -D: Specifies where to export the CSV files.
- LOGLEVEL: Sets the logging level for standard errors (stderr). The default is INFO. Options: DEBUG, INFO, WARNING, ERROR, CRITICAL. The logging level for messages in the log file is always DEBUG.
- LOGDIR: Log directory. The default directory is /var/log/bi.
Examples
wls_ldap_csv_exporter -u weblogic_admin -c /bi/app/fmw/oracle_common -D /myfolder
Import Users and Roles from a CSV File (BI Service Script)
(Only valid for services using the WebLogic embedded LDAP server.) Rather than add users manually one at a time through the Console, you can add a batch of users from a file. To do this, create a CSV (comma-separated values) file that contains the user data in a fixed format. You can create multiple user roles with member assignments from CSV files too. To import users and roles this way, use the script import_users_groups_csv.
It’s important that the CSV file is formatted correctly. Spaces aren’t allowed.
Import Type | Information Required in the CSV
---|---
Users | User ID,Display Name,Password,givenname,lastname,mail
Roles | Display Name,Description,User Members (one or more user IDs separated by a semicolon)
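The fixed formats above can be sketched with hypothetical data; all user IDs, names, passwords, and addresses below are made up, and note that the lines contain no spaces:

```shell
# Users CSV: User ID,Display Name,Password,givenname,lastname,mail
cat > /tmp/allmyusers.csv <<'EOF'
jsmith,JohnSmith,Secret123,John,Smith,jsmith@example.com
alee,AmyLee,Secret456,Amy,Lee,alee@example.com
EOF

# Roles CSV: Display Name,Description,User Members (user IDs separated by ;)
cat > /tmp/allmygroups.csv <<'EOF'
Analysts,ReportAuthors,jsmith;alee
EOF

wc -l < /tmp/allmyusers.csv
```

Files like these are what you pass to the import script as the filename argument.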
Script Location
/bi/app/public/bin/import_users_groups_csv
To run the script, see Run Administration Scripts.
Syntax
import_users_groups_csv [-h] --admin-user ADMIN_USER --type {users,groups} [--loglevel LOGLEVEL] [--logdir LOGDIR] filename
Where:
- filename: Name of the CSV file.
Optional parameters:
- -h: Shows help for the script and exits.
- --admin-user ADMIN_USER: Sets the administrator user name.
- --type {users,groups}: Specifies the type of CSV you want to import.
- LOGLEVEL: Sets the logging level for standard errors (stderr). The default is INFO. Options: DEBUG, INFO, WARNING, ERROR, CRITICAL. The logging level for messages in the log file is always DEBUG.
- LOGDIR: Log directory. The default directory is /var/log/bi.
Examples
import_users_groups_csv --admin-user weblogic_admin --type users allmyusers.csv
import_users_groups_csv --admin-user weblogic_admin --type groups allmygroups.csv
Update or Delete Users and Roles from Embedded LDAP (BI Service Script)
(Only valid for services using the WebLogic embedded LDAP server.) Rather than updating or deleting users manually one at a time through the Console, you can update or delete a batch of users from a file. To do this, create a CSV (comma-separated values) file that contains the user data in a fixed format. You can update or delete multiple user roles from CSV files too. To modify users and roles this way, use the script update_users_groups.
It’s important that the CSV file is formatted correctly. Spaces aren’t allowed.
Type | Information Required in the CSV
---|---
Users | User ID,Display Name,Password,givenname,lastname,mail
Roles | Display Name,Description,User Members (one or more user IDs separated by a semicolon)
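Because the fixed format forbids spaces, a quick pre-flight check like the following can catch a malformed file before you run the script; the sample file and its contents are hypothetical:

```shell
# Write a sample line in the fixed format, then scan the file for spaces.
printf 'jsmith,JohnSmith,NewSecret1,John,Smith,jsmith@example.com\n' > /tmp/update.csv
if grep -q ' ' /tmp/update.csv; then
  echo "CSV contains spaces: fix before running update_users_groups"
else
  echo "CSV format check passed"
fi
```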
Script Location
/bi/app/public/bin/update_users_groups
To run the script, see Run Administration Scripts.
Syntax
update_users_groups [-h] --admin-user ADMIN_USER --action {update,delete} --type {users,groups} [--loglevel LOGLEVEL] [--logdir LOGDIR] filename
Where:
- filename: Name of the CSV file.
Optional parameters:
- -h: Shows help for the script and exits.
- --admin-user ADMIN_USER: Sets the administrator user name.
- --action {update,delete}: Specifies whether you want to modify or delete the users (or groups) in the CSV file.
- --type {users,groups}: Specifies whether the CSV file contains users or groups.
- LOGLEVEL: Sets the logging level for standard errors (stderr). The default is INFO. Options: DEBUG, INFO, WARNING, ERROR, CRITICAL. The logging level for messages in the log file is always DEBUG.
- LOGDIR: Log directory. The default directory is /var/log/bi.
Example
update_users_groups --admin-user weblogic_admin --action delete --type users removeoldusers.csv
Create a Public Container for Sharing Content (BI Service Script)
You can define a public storage container so that other users can share their content. Use the script configure_public_storage to specify the storage container you want to use.
If you configured a public container when you set up your service, running this script overrides the container that you specified.
Script Location
/bi/app/public/bin/configure_public_storage
To run the script, see Run Administration Scripts.
Syntax
configure_public_storage user pwd baseurl container [force]
Where:
- user: Name of a user with permission to create containers.
- pwd: Password for the storage user.
- baseurl: Base URL for the storage service. For example: https://storage.oraclecloud.com/v1. The base URL takes one of these forms:
  - https://domain.storage.oraclecloud.com/v1. For example: https://example.storage.oraclecloud.com/v1
  - https://Storage-GUID.storage.oraclecloud.com/v1. For example: https://Storage-ab1c23de4456f78g9123456hi7k8j89.storage.oraclecloud.com/v1
- container: Name of the storage container you want to create, in the format storage-identityDomainID/containername.
Optional parameters:
- force: Override the current public container, if one is designated.
Example
configure_public_storage --user mystorageuser.Storageadmin --pwd secretpassword --baseURL https://storage.oraclecloud.com/v1 --container Storage-mystorageuser/My_Public_Container force
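The base URL and container patterns can be sketched by building them from an identity domain ID; the domain value below is hypothetical:

```shell
# Hypothetical identity domain ID; substitute your own.
identity_domain=mystorageuser
base_url="https://Storage-${identity_domain}.storage.oraclecloud.com/v1"
container="Storage-${identity_domain}/My_Public_Container"
echo "$base_url"
echo "$container"
```

These two values are what you pass as the baseurl and container arguments.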
Export and Import Datasets (BI Service Script)
You can export and import datasets that users have uploaded to Oracle Analytics Cloud - Classic. Use the script migrate_datafiles to export all the datasets in cloud storage to an archive (.zip file) and import them on another service.
Script Location
/bi/app/public/bin/migrate_datafiles
To run the script, see Run Administration Scripts.
Syntax
migrate_datafiles sikey archive action
Where:
- sikey: Service key. Always bootstrap.
- archive: Full path to the archive you want to create or import. For example, /tmp/mydatasets.zip.
- action: Either export or import.
Example — Export
$ migrate_datafiles bootstrap /tmp/mydatasets.zip export
Enter encryption password for archive: ENTER_PASSWORD
Confirm encryption password for archive: ENTER_PASSWORD
$ chmod ugo+rw /tmp/mydatasets.zip
Example — Import
If you haven’t done so already, copy the dataset archive you want to import to the target service, for example mydatasets.zip.
$ migrate_datafiles bootstrap /tmp/mydatasets.zip import
Enter encryption password for archive: ENTER_PASSWORD
Confirm encryption password for archive: ENTER_PASSWORD
Update Database Credentials (BI Service Script)
When you create a service with Oracle Analytics Cloud - Classic, various schemas are created and loaded into an associated Oracle Database Classic Cloud Service that you select. If the database administrator password for this Oracle Database Classic Cloud Service changes or expires, you can use the reset_schema_password script to update the password that your BI service uses to access its schemas.
1. Connect to the compute node for your service.
2. Change to the script folder /bi/app/public/bin.
3. Stop BI processes using the script stop_analytics_suite.
4. Update the schema password using the script reset_schema_password.
5. Restart BI processes using the script start_analytics_suite.
Syntax
reset_schema_password [-h] [LOGLEVEL] [LOGDIR]
Where:
- -h: Shows help for the script and exits.
- LOGLEVEL: Sets the logging level for standard errors (stderr). The default is INFO. Options: DEBUG, INFO, WARNING, ERROR, CRITICAL. The logging level for messages in the log file is always DEBUG.
- LOGDIR: Log directory. The default directory is /var/log/bi.
Example
To update the schema password, run all three scripts from /bi/app/public/bin
in this order:
> stop_analytics_suite
> reset_schema_password
> start_analytics_suite
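The three-step sequence can be sketched as a small wrapper; run_step here is a stand-in for invoking the real scripts from /bi/app/public/bin, and set -e stops the run if any step fails:

```shell
# Abort immediately if any step fails, so the suite is never restarted
# against a half-updated schema password.
set -e
run_step() { echo "running: $1"; }   # stand-in for ./"$1" on the node
run_step stop_analytics_suite
run_step reset_schema_password
run_step start_analytics_suite
```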
Gather Diagnostic Logs into a ZIP File (BI Service Script)
If you're troubleshooting an issue with your service or you need to contact Oracle Support, you can easily collect all available log files into one place. Use the script collect_diagnostic_logs to collect diagnostic data into a ZIP file.
Script Location
/bi/app/public/bin/collect_diagnostic_logs
To run the script, see Run Administration Scripts.
Syntax
collect_diagnostic_logs [-h] [--loglevel LOGLEVEL] [--logdir LOGDIR] filename
Where:
- filename: Name of the ZIP file you want to generate.
Optional parameters:
- -h: Shows help for the script and exits.
- LOGLEVEL: Sets the logging level for standard errors (stderr). The default is INFO. Options: DEBUG, INFO, WARNING, ERROR, CRITICAL. The logging level for messages in the log file is always DEBUG.
- LOGDIR: Log directory. The default directory is /var/log/bi.
Example
collect_diagnostic_logs DiagnosticsForMyService.zip
Get Status Information (BI Service Script)
You can find out the status of your service at any time. Use the script status to report whether WebLogic Server and various other BI processes are up and running.
If this script can't respond for some reason, restart the service and try again. See Stop and Start Component Processes (BI Service Script).
Syntax
status [-v]
Where:
- -v: Verbose output.
Example
>status
/Servers/AdminServer/ListenPort=7001
Accessing admin server using URL t3://xxx:7001
Status of Domain: /bi/domain/fmw/user_projects/domains/bi
NodeManager (xxx:9556): RUNNING

Name         Type    Machine  Restart Int  Max Restart  Status
-----------  ------  -------  -----------  -----------  -------
AdminServer  Server  xxx      unknown      unknown      RUNNING
bi_server1   Server  m1       unknown      unknown      RUNNING
obiccs1      OBICCS  m1       3600         2            RUNNING
obisch1      OBISCH  m1       3600         2            RUNNING
obips1       OBIPS   m1       3600         2            RUNNING
obijh1       OBIJH   m1       3600         2            RUNNING
obis1        OBIS    m1       3600         2            RUNNING
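If you save the status output, a short filter can flag any component that isn't running; the sample lines below are made up to show the idea:

```shell
# Sample lines standing in for real `status` output; the last field on each
# component line is its state.
sample='obips1 OBIPS m1 3600 2 RUNNING
obis1 OBIS m1 3600 2 STOPPED'
not_running=$(echo "$sample" | awk '$NF != "RUNNING" {print $1}')
echo "$not_running"
```

In a real session you would pipe the output of status into the same awk filter.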
Stop and Start Component Processes (BI Service Script)
If you’re having issues, you can restart the BI components running on your service rather than the entire service. Restarting BI components is often quicker than restarting the service. When you stop BI processes, anyone who is signed in is signed out. When you restart, users are prompted to sign in again. Use the scripts stop_analytics_suite and start_analytics_suite to stop and start BI components.
Script Location
/bi/app/public/bin/stop_analytics_suite
/bi/app/public/bin/start_analytics_suite
To run the scripts, see Run Administration Scripts.
Syntax
stop_analytics_suite [-h] [--loglevel LOGLEVEL] [--logdir LOGDIR]
start_analytics_suite [-h] [--loglevel LOGLEVEL] [--logdir LOGDIR]
Optional parameters:
- -h: Shows help for the script and exits.
- LOGLEVEL: Sets the logging level for standard errors (stderr). The default is INFO. Options: DEBUG, INFO, WARNING, ERROR, CRITICAL. The logging level for messages in the log file is always DEBUG.
- LOGDIR: Log directory. The default directory is /var/log/bi.
Examples
stop_analytics_suite
start_analytics_suite
Register SSL Private Keys with the HTTP Proxy for a Nonmetered Service (BI Service Script)
If you have a nonmetered subscription for Oracle Analytics Cloud - Classic, you can register your custom SSL certificates to secure HTTPS access to your service.
Note:
These instructions don't apply if you subscribe to Oracle Analytics Cloud - Classic through Universal Credits and your Oracle Analytics Cloud - Classic deployment uses Oracle Identity Cloud Service with a load balancer (Oracle Cloud Infrastructure Load Balancing Classic). If you have a load balancer enabled environment and want to use a custom certificate instead of the ones provided by Oracle, you need to set up your custom SSL certificates in Oracle Cloud Infrastructure Load Balancing Classic. See Importing a Load Balancer Digital Certificate and About the Load Balancer IP Addresses and Canonical Host Name.
Use the script proxy_register_ssl_private_key.py to register your private key and your Certificate Authority (CA) signed certificate.
When the service is created, a self-signed certificate is generated. The self-signed certificate is intended to be temporary and you must replace it with a new private key and a certificate signed by a CA which your browsers are configured to trust (that is, a commercial CA built into the browser by the browser vendor). The temporary certificate expires after one year.
Before You Run the Script
- Verify that your private key and SSL certificate files contain the required information:
  - The private key and CA signed certificate must use the DNS registered name as the common name (CN).
  - The CA signed certificate must also include the CN as the first Subject Alternative Name.
  - The private key and CA signed certificate files must be in PEM format.
  - The private key must not be protected with a passphrase.
  - Private key permissions must be set to read-only, and the file must be owned by the oracle user.

  To test any changes you make to certificates and certificate chains on Windows, you might need to clear your SSL state. From the Control Panel menu, select Internet Options, then Content, then Clear SSL State.
- If your service uses a DNS registered host name, specify the host name that you want to secure with SSL in servername.conf.

  Note: Each service has a public IP address available on the internet. You can register your own FQDN (Fully Qualified Domain Name) against this public IP address so your service appears in your organization's internet domain. The FQDN must match the CN in the certificate. The FQDN must also be present as a Subject Alternative Name in the certificate.

  1. Create a file named servername.conf at this location: /bi/data/httpd/conf.d/servername.conf
  2. Set permissions on the file so that it is owned by the oracle user and readable by everyone.
  3. In servername.conf, add a single line:
     ServerName <DNS name that matches your SSL certificate>
     For example: ServerName analytics.myexample.com
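The servername.conf steps can be sketched as follows; the file is written under /tmp here instead of /bi/data/httpd/conf.d, and the host name is the example from the text:

```shell
# Stand-in for /bi/data/httpd/conf.d on the compute node.
conf_dir=/tmp/conf.d
mkdir -p "$conf_dir"
printf 'ServerName analytics.myexample.com\n' > "$conf_dir/servername.conf"
chmod 644 "$conf_dir/servername.conf"   # readable by everyone
cat "$conf_dir/servername.conf"
```

On the node itself you would also confirm that the file is owned by the oracle user.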
Script Location
/bi/app/public/bin/proxy_register_ssl_private_key.py
To run the script, see Run Administration Scripts.
Syntax
proxy_register_ssl_private_key [-h] --serverName SERVERNAME --privatekeyPath PRIVATEKEYPATH --sslCertificatePath SSLCERTIFICATEPATH [--sslIntermediateCertificatePath SSLINTERMEDIATECERTIFICATEPATH]
Where:
- serverName: The DNS registered host name. For example: analytics.myexample.com
- privatekeyPath: The name and location of the file containing your private key. For example: /temp/myprivate.key
- sslCertificatePath: The name and location of the SSL certificate. For example: /temp/mycertfile.crt
- sslIntermediateCertificatePath: The name and location of the intermediate SSL certificate, if one exists.
Optional parameters:
- -h: Shows help for the script and exits.
- LOGLEVEL: Sets the logging level for standard errors (stderr). The default is INFO. Options: DEBUG, INFO, WARNING, ERROR, CRITICAL. The logging level for messages in the log file is always DEBUG.
- LOGDIR: Log directory. The default directory is /var/log/bi.
Redirect HTTP Calls to HTTPS (BI Service Script)
By default, both HTTP and HTTPS access to the Oracle Analytics Cloud URL is enabled. If you want to redirect all incoming HTTP traffic to HTTPS, you can use the script proxy_redirect_http_to_https.
For example, if you currently access the service using http://analytics.mycorp.com/analytics, after running the script you're redirected to https://analytics.mycorp.com/analytics. The browser should confirm the valid certificate.
Script Location
/bi/app/public/bin/proxy_redirect_http_to_https
Syntax
proxy_redirect_http_to_https [-h] [LOGLEVEL] [LOGDIR]
Optional parameters:
- -h: Shows help for the script and exits.
- LOGLEVEL: Sets the logging level for standard errors (stderr). The default is INFO. Options: DEBUG, INFO, WARNING, ERROR, CRITICAL. The logging level for messages in the log file is always DEBUG.
- LOGDIR: Log directory. The default directory is /var/log/bi.
Example
proxy_redirect_http_to_https
Enable Database Storage for User Group Memberships for a Nonmetered Service (BI Service Script)
If you have a nonmetered subscription for Oracle Analytics Cloud - Classic, you might want to store user group memberships in a database so that your service’s authentication provider can access this information when authenticating a user's identity. You can use the script configure_bi_sql_group_provider to set up the provider and create the tables that you need (GROUPS and GROUPMEMBERS). After you run the script, you must populate the tables with your group and group member (user) information.
Note:
Group memberships that you derive from the SQL provider don't show up in the Users and Roles page in Oracle Analytics Cloud Console as you might expect, but the member assignments work correctly.
These tables are in the Oracle Database Classic Cloud Service you configured for Oracle Analytics Cloud - Classic, in the schema created for your service. Unlike the equivalent on-premises functionality, you can’t change the location of these tables or the SQL that retrieves the results. Instead, you must populate these fixed tables using any supported means for loading database tables.
Script Location
/bi/app/public/bin/configure_bi_sql_group_provider
To run the script, see Run Administration Scripts.
Syntax
configure_bi_sql_group_provider [-h] [LOGLEVEL] [LOGDIR]
Where:
- -h: Shows help for the script and exits.
- LOGLEVEL: Sets the logging level for standard errors (stderr). The default is INFO. Options: DEBUG, INFO, WARNING, ERROR, CRITICAL. The logging level for messages in the log file is always DEBUG.
- LOGDIR: Log directory. The default directory is /var/log/bi.
Example
configure_bi_sql_group_provider
Migrate Essbase Applications and Users
You can migrate applications, users, and groups from Oracle Analytics Cloud – Essbase services v17.3.3 (or earlier) to the latest version, using export and import scripts.
Prerequisites
- Oracle Identity Cloud Service (IDCS) requires that user fields aren’t empty. If you’re enabling IDCS, then in your existing Essbase services, prior to migrating your data, open the Security tab and ensure that all user data fields (including ID, name, email, and role) contain values.
- When you export applications, the target file is overwritten. If you want to save the previous version of an exported application, rename it or run the export script with another file name.
- Before you migrate applications and users, copy the following scripts from the older Essbase service version to the latest version, at the same file location. You can check first whether they already exist on the new service.
  - /u01/app/oracle/tools/acss/BI/esscs_tools/lcm/esscs_lcm.py
  - /u01/app/oracle/tools/acss/BI/esscs_tools/lcm/idcs_users.py
  - /u01/app/oracle/tools/acss/BI/esscs_tools/lcm/ldap_users.py
  - /u01/app/oracle/tools/acss/BI/esscs_tools/lcm/user_group.py
  - /u01/app/oracle/tools/acss/BI/esscs_tools/public/essbase_export.sh
  - /u01/app/oracle/tools/acss/BI/esscs_tools/public/essbase_import.sh
Export Script Location
/bi/app/public/bin
Export Syntax
essbase_export.sh filename
Where:
- filename: Full path to the tar archive file that stores all Essbase applications, CSV files of users and groups, and files of settings.
Import Script Location
/bi/app/public/bin
Import Syntax
essbase_import.sh filename
Where:
- filename: Name of the tar archive created by the export script.
Gather Diagnostic Logs into a ZIP File (Essbase Service Script)
If you're troubleshooting an issue with your service or you need to contact Oracle Support, you can easily collect all available log files into one place.
Syntax
python collect_diagnostic_logs.py [-h] filename
Where:
- filename: Full path of the ZIP file that you want to generate.
Optional parameter:
- -h: Shows help for the script.
Example
python collect_diagnostic_logs.py /tmp/DiagnosticsForEssbaseService
Update Database Credentials (Essbase Service Script)
When you create an Essbase service, various schemas are created and loaded into the Oracle Database Classic Cloud Service that you selected. If the password expires for this Oracle Database Classic Cloud Service, you can use the changeRCUPassword script to update the password that your Essbase service uses to access the database.
The changeRCUPassword script changes passwords for:
- Wallet Store stored credentials
- Bootstrap credentials
- WebLogic server data sources
- Oracle database schemas
Syntax
python changeRCUPassword.py <new_password> <sys db_user> <sys db_password>
Where:
- new_password: New password for the database.
- sys db_user: System database user name.
- sys db_password: System database user password.
Example
python changeRCUPassword.py xxxxxxxx dsmith ds112233