This topic describes the usage notes and the procedure for migrating data from Qiniu Cloud Object Storage (KODO) to Object Storage Service (OSS).

Notes

When you perform an online migration, take note of the following issues:
  • A migration job occupies the network resources of the source data address and destination data address. To ensure business continuity, we recommend that you specify a speed limit for a migration job or perform the migration job during off-peak hours.
  • Before a migration job starts, the files at the source data address are compared with the files at the destination data address. A destination file is overwritten if a source file has the same name and a later modification time. If two files have the same name but different content, rename one of the files or back up the destination file before the migration. For a pre-flight check that lists such conflicts, see the sketch after this list.
  • Data Transport can migrate the data of only a single bucket at a time. It cannot migrate all data under an account in one job.
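
The following is a minimal pre-flight sketch that lists the objects on both sides and flags name collisions that a migration job would overwrite (same name, source modified later). It assumes the qiniu and oss2 Python SDKs are installed; all credentials, endpoints, and bucket names are placeholders that you must replace, and the exact return format of the Qiniu list call may vary with the SDK version.

```python
import oss2
from qiniu import Auth, BucketManager

QINIU_AK, QINIU_SK = "<qiniu-access-key>", "<qiniu-secret-key>"
QINIU_BUCKET = "<source-bucket>"
OSS_AK, OSS_SK = "<oss-access-key-id>", "<oss-access-key-secret>"
OSS_ENDPOINT = "https://oss-cn-hangzhou.aliyuncs.com"   # example Internet endpoint
OSS_BUCKET = "<destination-bucket>"

# Collect destination keys and their last-modified times (Unix seconds).
oss_bucket = oss2.Bucket(oss2.Auth(OSS_AK, OSS_SK), OSS_ENDPOINT, OSS_BUCKET)
dest = {o.key: o.last_modified for o in oss2.ObjectIterator(oss_bucket)}

# Page through the source bucket and report files that would be overwritten.
bm = BucketManager(Auth(QINIU_AK, QINIU_SK))
marker = None
while True:
    ret, eof, info = bm.list(QINIU_BUCKET, marker=marker, limit=1000)
    for item in ret.get("items", []):
        key = item["key"]
        put_time = item["putTime"] / 10_000_000   # putTime is in units of 100 ns
        if key in dest and put_time > dest[key]:
            print("will be overwritten:", key)
    if eof:
        break
    marker = ret.get("marker")
```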

Step 1: Create a source data address

  1. Log on to the Data Transport console.
  2. Choose Data Online Migration > Data Address, and click Create Data Address.
  3. In the Create Data Address panel, configure the following parameters and click OK.
    Parameter Required Description
    Data Type Yes Select Qiniu.
    Data Name Yes Enter a name that is 3 to 63 characters in length. Special characters are not supported, except for hyphens (-) and underscores (_).
    Endpoint Yes Enter an endpoint that corresponds to the region where the specified bucket is located. The format is http://<a Qiniu Cloud Integrated CDN (FUSION) test domain name> or http://<a custom FUSION accelerated domain name>.

    You can log on to the Qiniu Cloud console. Open the Object Storage page, find the bucket, and view the corresponding domain name in the Test Domain column or the Accelerated Domain column.

    Notice A daily traffic limit of 10 GB applies to FUSION test domains. If the size of the data to be migrated exceeds 10 GB, we recommend that you migrate the data in batches or use a FUSION accelerated domain. For more information, see Limits on test domains. A sketch that estimates the total data size and file count follows this procedure.
    Bucket Yes Enter the name of a Qiniu Cloud Object Storage (KODO) bucket. You need only to enter the custom name. For example, if the bucket name is tony-1234567890, enter tony in the field.
    Prefix Yes
    • Migrate All Data: All the data in the bucket is migrated.

      When you migrate all data, you do not need to specify a prefix.

    • Migrate Partial Data: Only the files in a specified directory (prefix) are migrated. A prefix cannot start with a forward slash (/) and must end with a forward slash (/). For example, you can specify data/to/oss/ as the prefix.
    Access Key and Secret Key Yes Specify the cloud API key pair that is used to migrate data. We recommend that you create a key for this migration and delete it after the migration.
  4. Because this feature is in public preview, you must apply for whitelist permissions. Click Application.
  5. Enter the required information and submit the application for using this feature. After the application is approved, you will receive a short message service (SMS) notification.
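
Before you fill in the Data Size and File Count parameters in Step 3, and to check whether the 10 GB daily limit on a FUSION test domain is a concern, you can estimate the size of the source bucket. The following is a minimal sketch that assumes the qiniu Python SDK is installed; the credentials and bucket name are placeholders.

```python
from qiniu import Auth, BucketManager

bm = BucketManager(Auth("<qiniu-access-key>", "<qiniu-secret-key>"))
total_bytes, total_files, marker = 0, 0, None
while True:
    ret, eof, info = bm.list("<source-bucket>", marker=marker, limit=1000)
    for item in ret.get("items", []):
        total_bytes += item.get("fsize", 0)   # fsize is the object size in bytes
        total_files += 1
    if eof:
        break
    marker = ret.get("marker")

print(f"{total_files} files, {total_bytes / 1024**3:.2f} GiB")
```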

Step 2: Create a destination data address

  1. Choose Data Online Migration > Data Address, and click Create Data Address.
  2. In the Create Data Address panel, configure the parameters and click OK.
    Parameter Required Description
    Data Type Yes Select OSS.
    Data Region Yes Select the region where the destination data address is located.
    Data Name Yes Enter a name that is 3 to 63 characters in length. Special characters are not supported, except for hyphens (-) and underscores (_).
    OSS Endpoint Yes
    Select an endpoint based on the region where your data is located.
    • http://oss-cn-endpoint.aliyuncs.com: indicates that you use an HTTP-based endpoint to access Object Storage Service (OSS) over the Internet.
    • https://oss-cn-endpoint.aliyuncs.com: indicates that you use an HTTPS-based endpoint to access OSS over the Internet.
    For more information, see Endpoint.
    Note When you migrate third-party data to OSS, you must access OSS from an Internet endpoint.
    Access Key Id and Access Key Secret Yes Specify the AccessKey pair that is used to migrate data. For more information, see Create an AccessKey pair.
    OSS Bucket Yes Select a bucket to store the migrated data.
    OSS Prefix No An OSS prefix cannot start with a forward slash (/) and must end with a forward slash (/). For example, you can specify data/to/oss/ as the OSS prefix. Do not specify this parameter if you want to migrate data to the root directory of the specified bucket.
    Notice If the name of a source file starts with a forward slash (/), you must specify an OSS prefix when you configure the destination data address. If no OSS prefix is specified, the migration job fails. For example, if the name of a file to be migrated is /test/test.png, you must specify an OSS prefix, such as oss/. After the migration job is completed, the name of the OSS file changes from /test/test.png to oss//test/test.png. The sketch after this procedure verifies the destination data address and shows this prefix behavior.
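
The following is a minimal sketch that verifies the destination parameters before you create the data address: it checks that the AccessKey pair can reach the OSS bucket over the selected Internet endpoint, and it previews how an OSS prefix is joined with a source key that starts with a forward slash (/). It assumes the oss2 Python SDK is installed; the endpoint, bucket name, and keys are placeholders.

```python
import oss2

endpoint = "https://oss-cn-hangzhou.aliyuncs.com"   # example Internet endpoint
bucket = oss2.Bucket(oss2.Auth("<access-key-id>", "<access-key-secret>"),
                     endpoint, "<destination-bucket>")

# Fails with an oss2 exception if the keys, endpoint, or bucket name are wrong.
info = bucket.get_bucket_info()
print("bucket region:", info.location)              # e.g. oss-cn-hangzhou

# The OSS prefix is prepended as-is, so a leading slash in the source key is kept.
oss_prefix, source_key = "oss/", "/test/test.png"
print("destination key:", oss_prefix + source_key)  # oss//test/test.png
```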

Step 3: Create a migration job

  1. Choose Data Online Migration > Migration Jobs, and click Create Job.
  2. In the Create Job panel, read the Terms of Migration Service and select I understand the above terms and conditions, and apply for opening data migration service. Click Next.
    The Fee Reminder dialog box then appears.
  3. In the Create Job dialog box, configure the parameters and click Next.
    Parameter Required Description
    Job Name Yes A job name must be 3 to 63 characters in length and can contain lowercase letters, digits, and hyphens (-). It cannot start or end with a hyphen (-).
    Source Data Address Yes Select the source data address that you have created.
    Destination Data Address Yes Select the destination data address that you have created.
    Notice If the source data address and the destination data address are located in different countries or regions, you must submit a ticket to request permissions to create a cross-country or cross-region migration job. You must ensure that your business is legitimate, that the data transfer complies with local laws and regulations, and that the data does not contain illegal information.
    Specified Directory No
    • Do not filter: All data at the source data address is migrated.
    • Exclude: The files and subdirectories in the excluded directories are not migrated.
    • Contain: Only the files and subdirectories in the specified directories are migrated.
    Note
    • A directory cannot start with a forward slash (/) or a backslash (\), and cannot contain double slashes (//), double periods (..), or double quotation marks ("). The character string that consists of all the specified directory names cannot exceed 10 KB in size.
    • A directory must end with a forward slash (/), for example, docs/.
    • You can specify a maximum of 20 directories of the Exclude or Contain type. A sketch that checks these constraints follows this procedure.
    Migration Type Yes
    • Full: specifies a full migration job. You must specify the Start Time Point of File parameter. Files with the last modification time later than the specified start time point are migrated. After the files are migrated, the migration job is closed. You can submit the job again if the data at the source data address changes. In this case, Data Transport only migrates the data that is changed after the previous job.
    • Incremental: specifies an incremental migration job. To perform an incremental migration, specify the Migration Interval and Migration Times parameters based on your needs. You must also specify the Start Time Point of File parameter. During the first migration, files whose last modification time is later than the specified start time point are migrated. After the first migration is complete, incremental migrations are performed based on the migration interval. Each incremental migration migrates only the files that are created or modified after the previous migration started and before the current migration starts. If you set the migration times to N, one full migration is performed, followed by (N-1) incremental migrations.

      For example, assume that you set the migration interval to 1 hour, the migration times to 5, and the start time point to 2019-03-05 08:00, and that the present time is 2019-03-10 08:00. When the first migration starts, Data Transport migrates the files that were last modified between 2019-03-05 08:00 and 2019-03-10 08:00. Assume that the first migration takes 1 hour to complete. The second migration then starts at 2019-03-10 10:00: the two hours from 08:00 to 10:00 consist of the duration of the first migration (1 hour) plus the migration interval (1 hour). During the second migration, files that were last modified between 2019-03-10 08:00 and 2019-03-10 10:00 are migrated. In total, the job performs one full migration and four incremental migrations. This timing is reproduced in the sketch after this procedure.
    Notice Before you start a migration job, Data Transport compares files of the source data address with those of the destination data address. If a source file has the same name as a destination file, the destination file is overwritten when one of the following conditions is met:
    • The content types of the source file and the destination file are different.
    • The source file is updated after the previous migration.
    • The size of the source file is different from that of the destination file.
    Start Time Point of File Yes
    • All: All files are migrated.
    • Assign: Files that are created or modified after the specified time are migrated. For example, if you set the start time point to 2018-11-01 08:00:00, only files that are created or modified after 2018-11-01 08:00:00 are migrated. Files that are created or modified before the specified time are skipped.
    Migration Interval Yes (only for incremental migration) The default value is 1 hour and the maximum value is 24 hours.
    Migration Times Yes (only for incremental migration) The default value is 1 and the maximum value is 30.
  4. On the Performance tab, navigate to the Data Prediction section and specify the Data Size and File Count parameters.
    Note To ensure a successful migration, estimate the amount of data to be migrated as accurately as possible. For more information, see Estimate the amount of data to be migrated.
  5. Optional. On the Performance tab, navigate to the Flow Control section, specify the Time Range and Max Flow parameters, and then click Add.
    Note To ensure business continuity, we recommend that you specify the Time Range and Max Flow parameters based on the workload peaks and troughs.
  6. Click Create and wait until the migration job is completed.
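
The directory constraints for the Specified Directory parameter can be checked ahead of time. The following is a small, hypothetical helper (not part of the Data Transport console) that validates a list of Exclude or Contain directories against the rules above: no leading forward slash (/) or backslash (\), no //, .., or double quotation marks ("), a trailing forward slash (/), at most 20 entries, and a combined length within 10 KB.

```python
def validate_directories(dirs):
    """Validate Exclude/Contain directories against the documented constraints."""
    if len(dirs) > 20:
        raise ValueError("at most 20 Exclude or Contain directories are allowed")
    if sum(len(d.encode("utf-8")) for d in dirs) > 10 * 1024:
        raise ValueError("the combined directory names exceed 10 KB")
    for d in dirs:
        if d.startswith(("/", "\\")):
            raise ValueError(f"{d!r} must not start with / or \\")
        if not d.endswith("/"):
            raise ValueError(f"{d!r} must end with /")
        if "//" in d or ".." in d or '"' in d:
            raise ValueError(f'{d!r} must not contain //, .., or "')

validate_directories(["docs/", "data/to/oss/"])   # passes without raising
```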
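
The following sketch reproduces the incremental migration timing example from the Migration Type parameter. Each run migrates the files that were last modified between the start of the previous run and the start of the current run, and the next run starts one migration interval after the previous run finishes. The run duration is an assumption used only for illustration, because actual durations depend on the data being migrated.

```python
from datetime import datetime, timedelta

start_time_point = datetime(2019, 3, 5, 8, 0)    # Start Time Point of File
first_run_start  = datetime(2019, 3, 10, 8, 0)   # when the first migration starts
interval         = timedelta(hours=1)            # Migration Interval
migration_times  = 5                             # one full + four incremental runs
assumed_duration = timedelta(hours=1)            # assumed length of each run

window_start, run_start = start_time_point, first_run_start
for n in range(1, migration_times + 1):
    print(f"run {n}: files modified between {window_start} and {run_start}")
    window_start = run_start
    run_start = run_start + assumed_duration + interval

# run 1: files modified between 2019-03-05 08:00:00 and 2019-03-10 08:00:00
# run 2: files modified between 2019-03-10 08:00:00 and 2019-03-10 10:00:00
```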

FAQ

What can I do if the migration rate is lower than expected?
  1. Check whether you have used a FUSION test domain of Qiniu Cloud. We recommend that you use FUSION accelerated domains instead of test domains to create migration jobs. Test domains have limits on bandwidth and the number of visits from a single IP address.
  2. Check the application scenario of your domain. Qiniu Cloud allocates bandwidth based on the application scenario of a domain. For example, if you select websites as the application scenario, only low bandwidth is allocated. If you specify such a domain as the endpoint of the source data address, the migration rate is limited by that bandwidth. To change the application scenario of a domain, submit a ticket to Qiniu Cloud.