The SQL Developer Amazon Redshift Migration Assistant, available with SQL Developer 17.4, provides a framework for easy migration of Amazon Redshift environments on a per-schema basis.
This section describes the steps and the workflow for both an online migration of Amazon Redshift and for the generation of scripts for a scheduled, manual migration that you run at a later time.
Connecting to Amazon Redshift
Using SQL Developer you can migrate database files from Amazon Redshift to Autonomous Data Warehouse. To extract metadata and data from Amazon Redshift for a migration, you need to connect to Amazon Redshift with SQL Developer.
Download Amazon Redshift JDBC Driver and Add the Third Party Driver
Download an Amazon Redshift JDBC driver to access Amazon Redshift. Consult the Amazon Redshift documentation for the location of the most recent JDBC driver. For more information, see Configure a JDBC Connection.
Store the Amazon Redshift JDBC driver in a local directory where SQL Developer can access the driver.
Add the Amazon Redshift JDBC driver to SQL Developer as a third-party driver before making a connection. Within SQL Developer, go to Preferences > Database > Third Party JDBC Drivers and add the downloaded Amazon Redshift JDBC driver.
Add Connection to Amazon Redshift Database
Connect to your Amazon Redshift database.
Create a new database connection and enter the connection information for your Amazon Redshift system. If you plan to migrate multiple schemas, it is recommended that you connect to your Amazon Redshift system as the master user.
If you deployed your Amazon Redshift environment within a Virtual Private Cloud (VPC), ensure that your cluster is accessible from the Internet. See http://docs.aws.amazon.com/redshift/latest/gsg/rs-gsg-authorize-cluster-access.html for details on how to enable public Internet access.
If your Amazon Redshift client connection to the database appears to hang or times out when running long queries, see http://docs.aws.amazon.com/redshift/latest/mgmt/connecting-firewall-guidance.html for possible solutions to address this issue.
Test the connection before you save it.
Starting the Cloud Migration Wizard
Invoke the Cloud Migration Wizard from the Tools menu of SQL Developer to initiate your Amazon Redshift migration to Autonomous Data Warehouse.
The Cloud Migration Wizard walks you through a short series of steps. The wizard guides you to:
Identify the schemas in your Amazon Redshift database that you want to migrate.
Identify the target Autonomous Data Warehouse.
Define whether you want to migrate the metadata (DDL), the data, or both.
Choose to migrate your system online, to generate scripts, or both.
Identify the Amazon Redshift Schemas for Migration and Scope of Migration
Connect to your Amazon Redshift database and select the schemas you want to migrate.
Migration to Autonomous Data Warehouse is on a per-schema basis. All objects in a selected schema are migrated. Schemas cannot be renamed as part of the migration.
For the schemas you select for migration you can choose one of the following options:
DDL and DATA: Both the schemas and the data are targeted for migration.
DDL: Only the schemas are targeted for migration.
DATA: Only the data is targeted for migration.
If you choose to migrate any data, you must provide an AWS Identity and Access Management (IAM) access key, an AWS secret access key, and an existing S3 bucket where the Amazon Redshift data is unloaded and staged. The security credentials must have sufficient privileges to store data in S3. It is recommended that you create new, separate access keys for the migration.
The same access key is used later to load the data into the Autonomous Data Warehouse using secure REST requests.
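For illustration, the migration stores the access key pair as a database credential in Autonomous Data Warehouse using the DBMS_CLOUD package; a minimal sketch with placeholder key values (the credential name REDSHIFT_DWCS_CREDS is the one the migration uses, as described in the post-migration tasks):

```sql
-- Create a database credential in Autonomous Data Warehouse from the
-- AWS access key pair; both key values below are placeholders.
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'REDSHIFT_DWCS_CREDS',
    username        => 'your-aws-access-key-id',
    password        => 'your-aws-secret-access-key'
  );
END;
/
```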
For more information on access keys, see AWS Identity and Access Management.
For information on common s3ServiceException errors, see S3ServiceException Errors.
Define how to Run the Migration
Create a connection for your target Autonomous Data Warehouse if one does not already exist. The user for this connection must have administrative privileges, because the connection is used throughout the migration to create schemas and objects. It is recommended that you use the ADMIN user of the Autonomous Data Warehouse.
The Amazon Redshift Migration Assistant allows you to perform an online migration right away, to generate all scripts necessary for a migration, or both. If you choose to store the scripts in a local directory, you must specify a local directory that the user can write to.
If you select DATA ONLY in Cloud Migrations Step 1 of 3 and you are performing an online migration, the Migration Wizard expects the schemas and their objects to already exist; you must create the schemas and objects before running the Migration Wizard.
Reviewing and Finishing the Amazon Redshift Migration
If you choose online migration, the migration wizard dialog stays open until the migration finishes. If you choose to generate scripts only, the migration process generates the scripts in the local directory you selected in the migration wizard.
Click Finish to complete your migration.
Using the Generated Amazon Redshift Migration Scripts
When you choose to generate the migration scripts, a new subdirectory is created in the local directory you specified in the migration process for Autonomous Data Warehouse.
The directory contains the following scripts:
redshift_s3unload.sql
adwc_ddl.sql
adwc_dataload.sql
These scripts contain all the commands necessary to migrate your Amazon Redshift system to Autonomous Data Warehouse. You can run these scripts in real time or use them for programmatic processing.
Unload Your Amazon Redshift Data into S3
The first step of a successful migration is to unload your Amazon Redshift data into Amazon S3, which acts as a staging area. The script redshift_s3unload.sql contains all the Amazon Redshift UNLOAD commands needed to unload the data using the access credentials and the S3 bucket that you specified in the Migration Wizard workflow.
Connect to your Amazon Redshift environment to run this script.
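The commands in redshift_s3unload.sql follow standard Amazon Redshift UNLOAD syntax; a minimal sketch for one table, where the schema, table, bucket name, and access key pair are all placeholders:

```sql
-- Unload one table to the staging S3 bucket. The table name, bucket
-- name, and AWS access key pair shown here are placeholders.
UNLOAD ('SELECT * FROM sample_schema.sales')
TO 's3://your-staging-bucket/sample_schema/sales_'
CREDENTIALS 'aws_access_key_id=your-access-key-id;aws_secret_access_key=your-secret-access-key'
DELIMITER '|' ESCAPE ALLOWOVERWRITE;
```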
Create Your Data Warehouse Objects
To prepare your Autonomous Data Warehouse, create your empty data warehouse schema prior to loading data. The Amazon Redshift Migration Assistant converted all Amazon Redshift schema structures into Oracle structures in the script adwc_ddl.sql.
The script must be executed while you are connected to your Autonomous Data Warehouse as a privileged user; for example, ADMIN.
By default, the schema created for the migration has the same name as the schema in Amazon Redshift. You must change the password for the specified user to a valid password, either in the script or after the script runs. If you want to change the schema name, you must also change the schema name and all references to that name.
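A sketch of the kind of statements adwc_ddl.sql contains, using a hypothetical schema name and a placeholder password (the grant of DWROLE is a typical Autonomous Data Warehouse convention and is shown as an assumption):

```sql
-- Create the target schema user (placeholder password) and a table
-- converted from its Redshift definition; all names are illustrative.
CREATE USER sample_schema IDENTIFIED BY "Change_this_password_1";
GRANT DWROLE TO sample_schema;

CREATE TABLE sample_schema.sales (
  sale_id   NUMBER,
  sale_date DATE,
  amount    NUMBER(10,2)
);
```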
Load Your Amazon Redshift Data into Your Oracle Autonomous Data Warehouse
Script adwc_dataload.sql contains all the load commands necessary to load your unloaded Amazon Redshift data straight from S3 into your Autonomous Data Warehouse.
The script must be executed while you are connected to your Autonomous Data Warehouse as a privileged user; for example, ADMIN.
If you changed the target schema name when you created your data warehouse objects, then you must adjust the target schema names in this script accordingly.
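The load commands in adwc_dataload.sql are based on the DBMS_CLOUD package; a minimal sketch of one such COPY_DATA call, with a placeholder bucket and the hypothetical schema and table from the earlier examples:

```sql
-- Load staged data from S3 into a target table using the credential
-- created during the migration; bucket and table names are placeholders.
BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name      => 'SALES',
    schema_name     => 'SAMPLE_SCHEMA',
    credential_name => 'REDSHIFT_DWCS_CREDS',
    file_uri_list   => 'https://your-staging-bucket.s3.amazonaws.com/sample_schema/sales_*',
    format          => json_object('delimiter' value '|')
  );
END;
/
```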
Post Migration Tasks
After a successful migration of your Redshift environment, consider the following post-migration tasks:
Drop schema SQLDEV_MIGREPOS
Drop the Amazon S3 bucket used for staging
Harden the Amazon account used for accessing S3
Drop the database credential used for data loading from S3
Harden your accounts in your Autonomous Data Warehouse
Drop schema SQLDEV_MIGREPOS
As part of the schema migration, the Migration Assistant installs a minimal migration repository in the target Autonomous Data Warehouse in the schema SQLDEV_MIGREPOS. After the migration, this schema is no longer needed and can be dropped or, alternatively, locked.
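Assuming the repository schema is no longer needed, it can be dropped (or locked instead) with standard statements:

```sql
-- Either drop the migration repository schema entirely ...
DROP USER SQLDEV_MIGREPOS CASCADE;

-- ... or lock it if you prefer to keep it:
-- ALTER USER SQLDEV_MIGREPOS ACCOUNT LOCK;
```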
Drop the Amazon S3 Bucket Used for Staging
Unless you want to use the unloaded Redshift data for other purposes, you can drop the S3 bucket containing the unloaded data.
Harden the Amazon Account Used for Accessing S3
You should deactivate the security access key used for S3 access unless it is needed for other purposes.
The Amazon security credentials used to access S3 are stored encrypted as the database credential REDSHIFT_DWCS_CREDS in your Autonomous Data Warehouse, in the schema of the privileged user that ran the migration. It is recommended that you drop this credential after a successful migration unless it is needed for other purposes. For more information, see DROP_CREDENTIAL Procedure.
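Connected as the privileged user that ran the migration, the credential can be dropped as follows:

```sql
-- Remove the stored AWS credential once the migration is complete.
BEGIN
  DBMS_CLOUD.DROP_CREDENTIAL(credential_name => 'REDSHIFT_DWCS_CREDS');
END;
/
```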
Harden your Accounts in Your Autonomous Data Warehouse
For the schemas created as part of the migration with the Migration Assistant, ensure that you change the passwords of these accounts, or lock and expire them if they are used solely for data storage.
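For example, for a hypothetical migrated schema sample_schema, either set a new password or lock and expire the account:

```sql
-- Set a new password (placeholder shown) ...
ALTER USER sample_schema IDENTIFIED BY "New_strong_password_1";

-- ... or lock and expire the account if it only holds data:
ALTER USER sample_schema PASSWORD EXPIRE ACCOUNT LOCK;
```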