F Using Oracle WebLogic SIP Container Export/Import

The following sections describe how to use Oracle WebLogic Server SIP Container Export and Import functions:

Note:

All examples in this document use Linux commands; adapt commands as appropriate for other operating systems.

F.1 Export

This section details the steps you must take in order to export your OWLSC data.

F.1.1 Export the Database Data from the Current Environment

Note:

The first four steps in this procedure are performed on your RDBMS machine.

Export data by using one of the Oracle database export utilities. Before executing the export commands, on the existing database machine, create a directory on the file system where the exported data dump files and log files will be stored.

$ mkdir ~/owlcs_data_pump_dir
  1. Change directory to the $ORACLE_HOME/bin directory of your database installation and execute the following commands. These commands can be executed as the sys user or the system user.

  2. Create DATA_PUMP_DIR in your database using the following commands:

    • $ ./sqlplus sys@<db_service> as sysdba

    • Enter password for sys user at the prompt.

    • At the SQL prompt, execute the following commands:

      SQL> create or replace directory DATA_PUMP_DIR as '<full_path_to_the_data_pump_dir_created_above>';
      SQL> commit;
      SQL> quit
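
    To optionally confirm that the directory object points at the path you created, you can query the standard DBA_DIRECTORIES dictionary view in the same SQL*Plus session, for example:

      SQL> select directory_path from dba_directories where directory_name = 'DATA_PUMP_DIR';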
      
  3. Export data in the subscriber data services schema.

    • $ ./expdp "'sys@<db_service> as sysdba'" SCHEMAS=<current_prefix>_orasdpsds DIRECTORY=DATA_PUMP_DIR DUMPFILE=<current_prefix>_orasdpsds.dmp LOGFILE=<current_prefix>_orasdpsds.log

    • Enter password for sys user at the prompt.

      Note:

      The value of the DIRECTORY parameter in the command line above must be specified as DATA_PUMP_DIR, not as the actual directory on the file system. When the command completes, verify that two files, one corresponding to DUMPFILE and the other to LOGFILE, are created in your data pump directory.
  4. Export data in the XDMS schema.

    • $ ./expdp "'sys@<db_service> as sysdba'" SCHEMAS=<current_prefix>_orasdpxdms DIRECTORY=DATA_PUMP_DIR DUMPFILE=<current_prefix>_orasdpxdms.dmp LOGFILE=<current_prefix>_orasdpxdms.log

    • Enter password for sys user at the prompt.

      Note:

      The value of the DIRECTORY parameter in the command line above must be specified as DATA_PUMP_DIR, not as the actual directory on the file system. When the command completes, verify that two files, one corresponding to DUMPFILE and the other to LOGFILE, are created in your data pump directory.

      Exporting the Location Service schema is not required, because location data is recreated when clients sign in to the server.
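
      As a quick check, you can list the data pump directory created earlier (the example path from the beginning of this procedure is assumed) to confirm that both pairs of dump and log files were written:

      $ ls ~/owlcs_data_pump_dir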

  5. Complete OWSM Policy Migration. Start WLST by running the following command:

    $ORACLE_HOME/common/bin/wlst.sh
    
  6. Connect to the local WLS instance by running the following command:

    wls:/offline> connect('weblogic','weblogic','127.0.0.1:7001')
    

    In this example, 'weblogic'/'weblogic' is a sample WLS administrator username/password pair; replace it with the real values for your environment. The port (the WLS AdminServer port) may differ if you have another instance of WLS running.

  7. Run the following WLST commands to export the policies and assertion templates. Replace "wlcs_server1" with your OWLSC instance name.

    wls:/base_domain/serverConfig> exportMetadata(application='wsm-pm',server='wlcs_server1',docs='/assertiontemplates/**',toLocation='/tmp/owsmexport/')
    wls:/base_domain/serverConfig> exportMetadata(application='wsm-pm',server='wlcs_server1',docs='/policies/**',toLocation='/tmp/owsmexport/')
    
  8. Exit the WLST command line tool by running the following command:

    wls:/base_domain/serverConfig> exit()
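
    After exiting WLST, you can optionally confirm that the export location was populated, for example:

    $ ls /tmp/owsmexport/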
    
  9. Export the Credential Store. (This step is performed on the machine where the OWLSC instance is installed, that is, your middleware machine.) OWSM stores client policy username and password credentials and keystore passwords in the credential store. Copy <domain>/config/fmwconfig/cwallet.sso from the current machine to the new machine. If this was already done as part of OPSS migration, you can omit this step.
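
    For example, using scp (the target user and host below are placeholders for illustration):

    $ scp <domain>/config/fmwconfig/cwallet.sso <user>@<new_middleware_host>:<domain>/config/fmwconfig/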

  10. Export UMS-related details to the new environment (if UMS is installed in your environment). If the User Messaging Service is used in the current and new environments, perform the following step:

    • Start WLST by running the following command:

      $ORACLE_HOME/common/bin/wlst.sh
      

      Note:

      This is the ORACLE_HOME on the middleware instance. By default, this is the <middleware_home>/as11gr1wlcs1 directory, where <middleware_home> is the directory in which OWLSC is installed.
  11. Run the following WLST command to download the user messaging preferences from the backend database to the specified XML file:

    wls:/offline> manageUserMessagingPrefs(operation='download', filename='/tmp/userprefs-dump.xml', url='t3://localhost:8001', username='weblogic', password='weblogic')
    

    Note:

    In the sample above, 'weblogic'/'weblogic' is a sample WLS administrator username/password pair; replace it with the real values for your environment. 8001 is the managed server port where UMS is running; replace it with the appropriate value for your environment.
  12. Exit the WLST command line tool by running the following command:

    wls:/offline> exit()
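
    Once WLST exits, you can optionally verify that the preferences file was written, for example:

    $ ls -l /tmp/userprefs-dump.xml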
    

F.2 Import

This section details the steps you must take in order to import your OWLSC data.

  1. Import the database data into the production environment by using one of the Oracle database import utilities. Before executing the import commands, on the new database machine, create a directory on the file system where the data dump files to be imported and the log files will be stored:

    $ mkdir ~/owlcs_data_pump_dir
    
  2. Copy (using ftp or remote copy) all .dmp files from the data pump directory on your old database machine to the directory created above.
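
    For example, using scp (the oracle user and <new_db_host> below are placeholders for illustration):

    $ scp ~/owlcs_data_pump_dir/*.dmp oracle@<new_db_host>:~/owlcs_data_pump_dir/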

  3. Change directory to the $ORACLE_HOME/bin directory of your production database installation and execute the following commands. These commands can be executed as the sys user or the system user.

    1. Create DATA_PUMP_DIR in your production database with the following commands:

      $ ./sqlplus sys@<db_service> as sysdba
      

      Enter password for sys user at the prompt.

      At the SQL prompt, execute the following commands:

      SQL> create or replace directory DATA_PUMP_DIR as '<full_path_to_the_data_pump_dir_created_above>';
      SQL> commit;
      SQL> quit
      
    2. Drop all sequences in the <production_prefix>_orasdpsds schema with the following commands:

      $ ./sqlplus sys@<db_service> as sysdba
      

      Enter password for sys user at the prompt.

      At the SQL prompt, execute the following commands:

      SQL> alter session set current_schema=<production_prefix>_orasdpsds;
      SQL> drop sequence ACCOUNT_SEQ;
      SQL> drop sequence CREDENTIALS_SEQ;
      SQL> drop sequence PRIVATE_IDENTITY_SEQ;
      SQL> drop sequence PUBLIC_IDENTITY_SEQ;
      SQL> drop sequence REALM_SEQ;
      SQL> drop sequence ROLE_SEQ;
      SQL> commit;
      SQL> quit
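
      If you want to confirm which sequences exist in the schema before dropping them, you can run a query such as the following in the same SQL*Plus session (the upper() call assumes the schema was created with the default uppercase name):

      SQL> select sequence_name from dba_sequences where sequence_owner = upper('<production_prefix>_orasdpsds');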
      
    3. Import data into the subscriber data services schema.

      $ ./impdp "'sys@<db_service> as sysdba'" REMAP_SCHEMA=<current_prefix>_orasdpsds:<production_prefix>_orasdpsds TABLE_EXISTS_ACTION=REPLACE DIRECTORY=DATA_PUMP_DIR DUMPFILE=<current_prefix>_orasdpsds.dmp LOGFILE=<production_prefix>_orasdpsds.log
      

      Enter password for sys user at the prompt.

      Note that the value of the DIRECTORY parameter in the command line above must be specified as DATA_PUMP_DIR, not as the actual directory on the file system. Ignore the error that the user already exists.

    4. Import data into the XDMS schema.

      $ ./impdp "'sys@<db_service> as sysdba'" REMAP_SCHEMA=<current_prefix>_orasdpxdms:<production_prefix>_orasdpxdms TABLE_EXISTS_ACTION=REPLACE DIRECTORY=DATA_PUMP_DIR DUMPFILE=<current_prefix>_orasdpxdms.dmp LOGFILE=<production_prefix>_orasdpxdms.log
      

      Enter password for sys user at the prompt.

      Note that the value of the DIRECTORY parameter in the command line above must be specified as DATA_PUMP_DIR, not as the actual directory on the file system. Ignore the error that the user already exists.

      Importing the Location Service schema is not required, because location data is recreated when clients sign in to the server.
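
      To review the results of each import, you can inspect the corresponding log file in the data pump directory (the example path created earlier is assumed), for example:

      $ tail ~/owlcs_data_pump_dir/<production_prefix>_orasdpsds.log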

  4. Copy the /tmp/owsmexport directory to the new machine.
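
    For example, using scp (the user and host below are placeholders for illustration):

    $ scp -r /tmp/owsmexport <user>@<new_machine>:/tmp/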

  5. Start WLST by running the following command:

    $ORACLE_HOME/common/bin/wlst.sh
    
  6. Connect to the local WLS instance by running the following command:

    wls:/offline> connect('weblogic','weblogic','127.0.0.1:7001')
    

    Note:

    In the sample above, 'weblogic'/'weblogic' is a sample WLS administrator username/password pair; replace it with the real values for your environment. The port (the WLS AdminServer port) may differ if you have another instance of WLS running.
  7. Run the following WLST commands to replace the policies and assertion templates. Replace wlcs_server1 with your OWLSC instance name:

    wls:/base_domain/serverConfig> deleteMetadata(application='wsm-pm',server='wlcs_server1',docs='/assertiontemplates/**')
    wls:/base_domain/serverConfig> deleteMetadata(application='wsm-pm',server='wlcs_server1',docs='/policies/**')
    wls:/base_domain/serverConfig> importMetadata(application='wsm-pm',server='wlcs_server1',docs='/assertiontemplates/**',fromLocation='/tmp/owsmexport/')
    wls:/base_domain/serverConfig> importMetadata(application='wsm-pm',server='wlcs_server1',docs='/policies/**',fromLocation='/tmp/owsmexport/')
    
  8. Exit the WLST command line tool by running the following command:

    wls:/base_domain/serverConfig> exit()
    

    Note:

    Regarding importing the OWSM keystore:
    • Private keys will differ between the current and new environments, so they do not need to be migrated.

    • Public keys, intermediate certificates, and root certificates may be migrated from the current environment to the new one. Use the Java keytool export and import commands to move them (a sketch follows this note). After doing so, review whether these certificates are applicable in the new environment based on the clients invoking the services there.

    • Review whether new public keys of client certificates, intermediate CA certificates, or root CA certificates must be added to the new keystore based on the clients invoking the services in the new environment.

    For more details, see Oracle Fusion Middleware Security and Administrator's Guide for Web Services.
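
    A minimal keytool sketch follows; the alias, keystore file names, and certificate file name are hypothetical placeholders, and keytool prompts for the keystore passwords:

      $ keytool -exportcert -alias <cert_alias> -keystore <current_keystore>.jks -file <cert_alias>.cer
      $ keytool -importcert -alias <cert_alias> -keystore <new_keystore>.jks -file <cert_alias>.cer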

  9. Go to the new machine.

  10. Start WLST by running the following command:

    $ORACLE_HOME/common/bin/wlst.sh
    
  11. Run the following WLST command to upload the User Messaging Preferences from file to the backend database:

    wls:/offline> manageUserMessagingPrefs(operation='upload', filename='/tmp/userprefs-dump.xml', url='t3://localhost:8001', username='weblogic', password='weblogic')
    

    Note:

    In the sample above, 'weblogic'/'weblogic' is a sample WLS administrator username/password pair; replace it with the real values for your environment. 8001 is the managed server port where UMS is running; replace it with the appropriate value for your environment.
  12. Observe the message confirming a successful upload, then exit the WLST command line tool by running the following command:

    wls:/offline> exit()
    

    Note:

    For different options on performing the download or upload, execute help('manageUserMessagingPrefs') at the wls:/offline> prompt. User devices provisioned in the LDAP store are dynamic; the assumption is that both the current and new environments will point to the same LDAP store or will be reconfigured to use the same set of information.