Migrating a MySQL HeatWave DB system to a MySQL Instance

You can migrate the data stored in a MySQL HeatWave DB system to a MySQL instance that is running on-premises or with another cloud vendor as a managed or unmanaged service. You can also export the database tables as text data to be used in other databases or applications.

MySQL Shell is an advanced client and code editor for MySQL. It provides SQL functionality plus scripting capabilities for JavaScript and Python, and includes various APIs and utilities for working with MySQL databases. MySQL Shell has dump and load utilities for exporting and importing data stored in MySQL instances. The version of MySQL Shell should be equal to or higher than the version of the MySQL instance; it is recommended to install and use the latest version of MySQL Shell.
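
A quick way to confirm the installed version and start the shell in JavaScript mode is from the operating system shell; mysqlsh is the MySQL Shell command-line binary:
    $ mysqlsh --version
    $ mysqlsh --js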

Exporting a MySQL HeatWave DB system

You can export the data of your MySQL HeatWave DB system by using one of the following MySQL Shell dump utilities:
  • util.dumpInstance(outputUrl[, options]): MySQL instance export utility that exports all compatible schemas to an Object Storage bucket or to local files. By default, this utility exports users, events, routines, and triggers.
  • util.dumpSchemas(schemas, outputUrl[, options]): MySQL schema export utility that exports selected schemas to an Object Storage bucket or to local files.
  • util.dumpTables(schema, tables, outputUrl[, options]): MySQL table export utility that exports selected tables of a schema to an Object Storage bucket or to local files.
You can write the exported dump files to one of the following locations; a sketch of the options used to target each cloud location follows this list:
  • Local file system in the client machine running MySQL Shell
  • Oracle Cloud Infrastructure Object Storage
  • Amazon Web Services (AWS) S3-compatible bucket
  • Microsoft Azure Blob Storage
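
The dump utilities select a cloud target through dedicated options: osBucketName and osNamespace for Oracle Cloud Infrastructure Object Storage, s3BucketName for an AWS S3-compatible bucket, and azureContainerName for Azure Blob Storage. A minimal sketch follows; the bucket, namespace, and container names are placeholders, and the calls assume that the corresponding credentials (OCI config file, AWS credentials, or Azure storage configuration) are already set up on the client machine:
    shell-js> util.dumpInstance("mydump", {osBucketName: "myBucket", osNamespace: "myNamespace"})
    shell-js> util.dumpInstance("mydump", {s3BucketName: "my-s3-bucket"})
    shell-js> util.dumpInstance("mydump", {azureContainerName: "my-container"})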

You can install MySQL Shell on an Oracle Cloud Infrastructure compute instance or another client machine that can connect to the MySQL HeatWave DB system containing the data that you want to export. Then, run MySQL Shell and connect to the DB system with a user account that has sufficient privileges to access that data. Once connected to the DB system, you can run the dump utilities to export the data.
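
For example, after starting MySQL Shell you can open a session to the DB system endpoint with the \connect command; the user name and private IP address below are placeholders for your own values:
    shell-js> \connect admin@10.0.1.8:3306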

Some examples:
  • Export all schemas of a MySQL HeatWave DB system and store the dump files in a file system directory named export-instance:
    shell-js> util.dumpInstance("export-instance", {ocimds: true})
    When the ocimds option is set to true in an instance dump, all MySQL HeatWave specific database objects and user accounts are excluded from the export. The utility exports all schemas, tables, users, events, routines, and triggers created by the user in the MySQL HeatWave DB system, which includes:
    • All user-created schemas together with all database objects and data in the schemas.
    • All user-created database user accounts together with their granted privileges.
    You can preview what the dump would include, and any compatibility issues it would report, with a dry run, as sketched after these examples.
  • Export the world schema and store the dump files in a file system directory named export-schemas-world:
    shell-js> util.dumpSchemas(["world"], "export-schemas-world")

    This example exports all database objects and data in the world schema. Database user accounts are not exported.

  • Export the city and country tables in the world schema and store the dump files in a file system directory named export-tables-city-country:
    shell-js> util.dumpTables("world", ["city", "country"], "export-tables-city-country")

    This example exports all data in the city and country tables in the world schema. Other database objects in the world schema such as stored routines and events are not exported. Database user accounts are also not exported.

  • Export the city and country tables in the world schema as CSV text files, and store the dump files in a file system directory named export-csv-city-country:
    shell-js> util.dumpTables("world", ["city", "country"], "export-csv-city-country", {dialect: "csv", compression: "none"})

    This example exports all data in the city and country tables in the world schema into uncompressed CSV text files. These text files can be loaded into other databases or applications, if needed.

  • Export the MySQL instance and store the dump files in an Oracle Cloud Infrastructure bucket specified by a Pre-Authenticated Request (PAR):
    shell-js> util.dumpInstance("https://objectstorage.region.oraclecloud.com/p/secret/n/myTenancy/b/myBucket/o/", {ocimds: true})
    The bucket must be empty initially.
  • Export the MySQL instance and store the dump files in an Oracle Cloud Infrastructure bucket with a prefix specified by a Pre-Authenticated Request (PAR):
    shell-js> util.dumpInstance("https://objectstorage.region.oraclecloud.com/p/secret/n/myTenancy/b/myBucket/o/MyPrefix/", {ocimds: true})
    The bucket must not have any object with the same prefix.
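
Before writing any dump files, you can also do a dry run to see what would be exported and whether any compatibility issues are reported; when the dryRun option is set to true, the utility performs its checks but writes no data:
    shell-js> util.dumpInstance("export-instance", {ocimds: true, dryRun: true})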

For a reference of all available options, see MySQL Shell Dump Utilities.

Importing into a MySQL Instance

You can use the MySQL Shell dump loading utility to import the data exported using the MySQL Shell dump utilities into any MySQL instance. The MySQL instance may be running on-premises or with another cloud vendor as a managed or unmanaged service.
util.loadDump(url[, options])

Run MySQL Shell on a client machine and connect to the MySQL instance with a user account that has the required privileges to import the data. Once connected to the MySQL instance, you can run the MySQL Shell dump loading utility to import the data.
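
The dump loading utility uses LOAD DATA LOCAL INFILE statements, so the local_infile system variable must be enabled on the target MySQL instance before the import. A minimal check and adjustment from a connected MySQL Shell session (changing the global variable requires a privilege such as SYSTEM_VARIABLES_ADMIN):
    shell-js> session.runSql("SHOW GLOBAL VARIABLES LIKE 'local_infile'")
    shell-js> session.runSql("SET GLOBAL local_infile = ON")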

Some examples:
  • This example imports all the database objects and data from the dump files in the file system directory named export-instance:
    shell-js> util.loadDump("export-instance")
  • This example imports all the database objects and data from an Oracle Cloud Infrastructure bucket specified by a Pre-Authenticated Request (PAR):
    shell-js> util.loadDump("https://objectstorage.region.oraclecloud.com/p/secret/n/myTenancy/b/myBucket/o/", progressFile: "progress.json"})
    When importing from an Oracle Cloud Infrastructure bucket using a PAR, a local progress file is required to keep track of the state of the import.
  • This example imports all the database objects and data from an Oracle Cloud Infrastructure bucket with a prefix specified by a Pre-Authenticated Request (PAR):
    shell-js> util.loadDump("https://objectstorage.region.oraclecloud.com/p/secret/n/myTenancy/b/myBucket/o/MyPrefix/", progressFile: "progress.json"})
    When importing from an Oracle Cloud Infrastructure bucket using a PAR, a local progress file is required to keep track of the state of the import.
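
If an import is interrupted, running the same util.loadDump() call again resumes from the state recorded in the progress file. To discard that state and reload everything from the beginning instead, you can set the resetProgress option; the sketch below reuses the local directory dump from the first example:
    shell-js> util.loadDump("export-instance", {resetProgress: true})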

For a reference of all available options, see MySQL Shell Dump Loading Utility.