This topic describes how to migrate the data of some or all databases from a self-managed SQL Server instance to an ApsaraDB RDS for SQL Server instance by using full backup files.

Prerequisites

  • The RDS instance runs one of the following SQL Server versions and RDS editions:
    • SQL Server 2017 EE or SQL Server 2019 EE on RDS Cluster Edition
    • SQL Server 2008 R2, SQL Server 2012 SE, SQL Server 2012 EE, SQL Server 2016 SE, SQL Server 2016 EE, SQL Server 2017 SE, or SQL Server 2019 SE on RDS High-availability Edition
    • SQL Server 2012 EE Basic, SQL Server 2012 Web, or SQL Server 2016 Web on RDS Basic Edition
  • The source database is a self-managed SQL Server database.
  • If the credentials of a Resource Access Management (RAM) user are used to migrate data, make sure that the AliyunOSSFullAccess and AliyunRDSFullAccess policies are attached to the RAM user. For more information about how to grant permissions to RAM users, see Use RAM to manage OSS permissions and Use RAM to manage ApsaraDB RDS permissions.

Background information

ApsaraDB RDS for SQL Server provides an instance-level migration method that you can use to migrate the data of some or all databases from a self-managed instance to an RDS instance. You only need to upload the full backup files of these databases to the same folder in an Object Storage Service (OSS) bucket and then run the migration script to migrate the data to the RDS instance.

Usage notes

Only full backup files can be used for the data migration.

Preparations

  1. Install Python 2.7.18. For more information, visit the Python official website.
  2. Check whether Python 2.7.18 is installed.
    • Windows operating systems

      Run the c:\Python27\python.exe -V command to check the Python version. If Python 2.7.18 is displayed, Python 2.7.18 is installed.

      If the system prompts that the preceding command is not an internal or external command, add the Python installation path and the pip command path to the Path environment variable.

    • Mac, Linux, or Unix operating systems

      Run the python -V command to check the Python version. If Python 2.7.18 is displayed, Python 2.7.18 is installed.
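      Alternatively, you can check the interpreter version from within Python itself. The following minimal sketch works on any of the preceding operating systems:

      # Print the version of the Python interpreter that runs this snippet.
      import sys
      print(sys.version)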

  3. Use one of the following methods to install the SDK dependency package:
    • Run the pip command.
      pip install aliyun-python-sdk-rds
      pip install oss2
    • Use the source code.
      # Clone the Alibaba Cloud OpenAPI SDK repository.
      git clone https://github.com/aliyun/aliyun-openapi-python-sdk.git
      # Install the Alibaba Cloud SDK core library.
      cd aliyun-openapi-python-sdk/aliyun-python-sdk-core
      python setup.py install
      # Install the ApsaraDB RDS SDK.
      cd ../aliyun-python-sdk-rds
      python setup.py install
      # Clone the OSS SDK repository and install oss2.
      cd ../..
      git clone https://github.com/aliyun/aliyun-oss-python-sdk.git
      cd aliyun-oss-python-sdk
      python setup.py install
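    After the installation, you can optionally confirm that the dependencies are importable before you run the migration script. The following is a minimal sketch; the package names are the ones installed by the preceding commands:

      # Minimal import check for the SDK dependencies. If an import fails,
      # repeat the corresponding installation step.
      import oss2             # OSS Python SDK
      import aliyunsdkcore    # Alibaba Cloud SDK core library
      import aliyunsdkrds     # ApsaraDB RDS SDK
      print('oss2 version: ' + oss2.__version__)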
  4. Create an OSS bucket. Make sure that the OSS bucket resides in the same region as the RDS instance. For more information, see Create buckets.
    Note Skip this step if an OSS bucket that resides in the same region as the RDS instance already exists.
  5. Create databases on the RDS instance if the RDS instance runs SQL Server 2008 R2. Make sure that each database whose data you want to migrate from the self-managed instance has a counterpart with an identical name on the RDS instance, and keep the created databases empty. If the RDS instance runs SQL Server 2012 or later, do not create the databases in advance. Make sure that each database whose data you want to migrate does not have a counterpart with an identical name on the RDS instance.
  6. Back up some or all databases on the self-managed instance.
    Warning
    • For data consistency purposes, we recommend that you stop data writes to these databases during the full backup.
    • If you do not use the backup script to perform the full backup, the names of the generated backup files must follow the Database name_Backup type_Backup time.bak format. Example: Testdb_FULL_20180518153544.bak.
    1. Download the backup script file.
    2. Double-click the backup script file to open it by using Microsoft SQL Server Management Studio (SSMS).
    3. Configure the following parameters:
      @backup_databases_list: The names of the self-managed databases that you want to back up. If you specify multiple databases, separate the names of these databases with semicolons (;) or commas (,).
      @backup_type: The backup type. Valid values:
      • FULL: full backup
      • DIFF: differential backup
      • LOG: log backup
      Notice In this example, set the value to FULL.
      @backup_folder: The directory on the self-managed instance that is used to store the backup files. If the specified directory does not exist, the system automatically creates one.
      @is_run: Specifies whether to perform a backup or only a check. Valid values:
      • 1: performs a backup.
      • 0: performs only a check.
      Examples:
      SELECT
          /**
          * Databases list needed to backup, delimiter is : or ,
          * empty('') or null: means all databases excluding system database
          * example: '[testdb]: TestDR, Test, readonly'
          **/
          @backup_databases_list = N'[dtstestdata],[testdb]',
          @backup_type = N'FULL',                    -- Backup Type? FULL: FULL backup; DIFF: Differential backup; LOG: Log backup
          @backup_folder = N'C:\BACKUP',             -- Backup folder to store backup files.
          @is_run = 1                                -- Check or run? 1, run directly; 0, just check
    4. Run the backup script to back up the specified databases and store the backup files in the specified directory.
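    If you produced the full backup files without the backup script, you can check whether their names follow the required Database name_Backup type_Backup time.bak format before you upload them. The following is a minimal sketch; the folder path and the regular expression are assumptions that you may need to adjust:

      # -*- coding: utf-8 -*-
      # Check that each .bak file name matches the expected
      # "Database name_Backup type_Backup time.bak" convention,
      # for example Testdb_FULL_20180518153544.bak.
      import os
      import re

      backup_folder = r'C:\BACKUP'  # the directory that stores the backup files
      name_pattern = re.compile(r'^.+_(FULL|DIFF|LOG)_\d{14}\.bak$', re.IGNORECASE)

      for file_name in os.listdir(backup_folder):
          if file_name.lower().endswith('.bak') and not name_pattern.match(file_name):
              print('Warning: %s does not match the expected naming format.' % file_name)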
  7. Use one of the following methods to upload the backup files to the OSS bucket that you created.
    Warning Make sure that the OSS bucket and the RDS instance reside in the same region. This allows the RDS instance to read the backup files at faster speeds. This also helps prevent failures that may occur if the backup files cannot be downloaded.
    • Use ossbrowser to upload backup files: We recommend that you use ossbrowser to upload the backup files to the OSS bucket. For more information, see Use ossbrowser.
      Notice If the self-managed instance is deployed on an Elastic Compute Service (ECS) instance that resides in a virtual private cloud (VPC), you can upload the backup files to the OSS bucket over the VPC endpoint of the bucket. This improves upload efficiency.
    • Use the OSS console to upload backup files: If the size of each backup file is smaller than 5 GB, you can upload the files in the OSS console. For more information, see Upload objects.
    • Call OSS API operations to upload backup files: If you want to upload the backup files unattended, use OSS API operations to perform resumable uploads. For more information, see Multipart upload.
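    If you want to script the upload instead of using ossbrowser or the OSS console, you can use the oss2 package that you installed in the preparations, which supports resumable uploads. The following is a minimal sketch; the AccessKey pair, endpoint, bucket name, and folder names are placeholders that you must replace with your own values:

      # -*- coding: utf-8 -*-
      # Upload all local .bak files to a folder in an OSS bucket by using
      # resumable upload, which can resume an interrupted transfer.
      import os
      import oss2

      auth = oss2.Auth('<access_key_id>', '<access_key_secret>')
      bucket = oss2.Bucket(auth, 'http://oss-cn-beijing.aliyuncs.com', 'testdatabucket')

      backup_folder = r'C:\BACKUP'   # local folder that stores the full backup files
      oss_folder = 'Migrationdata'   # destination folder in the OSS bucket

      for file_name in os.listdir(backup_folder):
          if file_name.lower().endswith('.bak'):
              local_path = os.path.join(backup_folder, file_name)
              object_key = oss_folder + '/' + file_name
              oss2.resumable_upload(bucket, object_key, local_path)
              print('Uploaded %s to %s' % (file_name, object_key))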

Procedure

  1. Download the migration script package.
  2. Decompress the migration script package and run the following command to view the parameters that you need to specify:
    python ~/Downloads/RDSSQLCreateMigrateTasksBatchly.py -h

    A similar result is returned:

    RDSSQLCreateMigrateTasksBatchly.py -k <access_key_id> -s <access_key_secret> -i <rds_instance_id> -e <oss_endpoint> -b <oss_bucket> -d <directory>
    Table 1. Parameters
    access_key_id: The AccessKey ID of the Alibaba Cloud account to which the RDS instance belongs.
    access_key_secret: The AccessKey secret of the Alibaba Cloud account to which the RDS instance belongs.
    rds_instance_id: The ID of the RDS instance.
    oss_endpoint: The endpoint of the OSS bucket that stores the backup files. For more information about how to obtain the endpoint, see Bucket overview.
    oss_bucket: The name of the OSS bucket that stores the backup files.
    directory: The folder that stores the backup files in the OSS bucket. If the backup files are stored in the root folder, enter a forward slash (/).
  3. Run the migration script to complete the migration task.

    Examples:

    The following command migrates all the backup files that meet the naming requirements from the Migrationdata folder in the OSS bucket named testdatabucket to the RDS instance whose ID is rm-2zesz5774ud8s****.

    python ~/Downloads/RDSSQLCreateMigrateTasksBatchly.py -k LTAIQ**** -s BMkIUhroub******** -i rm-2zesz5774ud**** -e oss-cn-beijing.aliyuncs.com -b testdatabucket -d Migrationdata
  4. View the progress of the migration task.
    1. Go to the RDS Instances page, select the region of the RDS instance at the top, and then click the ID of the target RDS instance.
    2. Perform the following steps based on the SQL Server version that is run on your RDS instance:
      • SQL Server 2008 R2

        In the left-side navigation pane, click Database Migration to Cloud. You can view all the migration tasks that you have submitted.

        Note You can click Refresh in the upper-right corner of the page to view the latest status of the migration tasks.
      • SQL Server 2012 and later

        In the left-side navigation pane, click Backup and Restoration. Then, click the Backup Data Upload History tab.

        Note By default, the migration records over the last seven days are displayed. You can specify a time range to view the migration tasks over the specified time range.
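    You can also query the migration tasks programmatically by calling the Query backup data migration tasks operation that is listed in the Related operations section. The following is a rough sketch that uses the RDS SDK installed in the preparations. The request class name and the parameter setters follow the SDK's generated naming and may differ between SDK versions, so check the API reference for the exact required parameters:

      # -*- coding: utf-8 -*-
      # Query backup data migration tasks of an RDS instance (sketch).
      import json

      from aliyunsdkcore.client import AcsClient
      from aliyunsdkrds.request.v20140815.DescribeMigrateTasksRequest import DescribeMigrateTasksRequest

      client = AcsClient('<access_key_id>', '<access_key_secret>', 'cn-beijing')

      request = DescribeMigrateTasksRequest()
      request.set_DBInstanceId('rm-2zesz5774ud****')
      # Some SDK versions also require a time range, for example
      # request.set_StartTime(...) and request.set_EndTime(...).

      response = client.do_action_with_exception(request)
      print(json.dumps(json.loads(response), indent=2))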

Common errors

• Error message: HTTP Status: 404 Error:InvalidAccessKeyId.NotFound Specified access key is not found. RequestID: XXXXXXXXXXXXXXXXX
  Cause: The AccessKey ID that is used to call API operations is invalid.
  Solution: Use a valid AccessKey ID and AccessKey secret. For more information, see FAQ about AccessKey pairs.
• Error message: HTTP Status: 400 Error:IncompleteSignature The request signature does not conform to Aliyun standards. server string to sign is:......
  Cause: The AccessKey secret that is used to call API operations is invalid.
  Solution: Use a valid AccessKey ID and AccessKey secret. For more information, see FAQ about AccessKey pairs.
• Error message: RDS engine doesn't support, this is only for RDS SQL Server engine.
  Cause: The RDS instance to which you want to migrate data does not run SQL Server.
  Solution: Use an RDS instance that runs SQL Server.
• Error message: Couldn't find specify RDS [XXX].
  Cause: The specified RDS instance ID does not exist.
  Solution: Check whether the ID of the RDS instance is valid. If it is not, enter the valid instance ID.
• Error message: {'status': -2, 'request-id': '', 'details': "RequestError: HTTPConnectionPool(host='xxxxxxxxxxxxxxxxx', port=80): Max retries exceeded with url: /?bucketInfo= (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x10e996490>: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known',))"}
  Cause: The endpoint that is used to connect to the OSS bucket is invalid.
  Solution: Check whether the endpoint that is used to connect to the OSS bucket is valid. If it is not, enter the valid endpoint. For more information, see Bucket overview.
• Error message: {'status': 404, 'request-id': 'xxxxxxxxx', 'details': {'HostId': 'xxxxxxxxx', 'Message': 'The specified bucket does not exist.', 'Code': 'NoSuchBucket', 'RequestId': 'xxxxxxxx', 'BucketName': 'aaaatp-test-on-ecs'}}
  Cause: The specified OSS bucket does not exist.
  Solution: Check whether the entered name of the OSS bucket is valid. If it is not, enter the valid name.
• Error message: There is no backup file on OSS Bucket [xxxxxx] under [xxxxxxxxx] folder, check please.
  Cause: The specified folder does not exist in the OSS bucket, or the folder does not contain backup files that meet the naming requirements.
  Solution: Check whether the folder exists in the OSS bucket and whether it contains backup files that meet the naming requirements. If it does not, create the folder and upload qualified backup files to it.
• Error message: Warning!!!!!, [autotest_2005_ent_broken_full_dbcc_failed.bak] is not backup file, filtered.
  Cause: The names of the backup files do not meet the naming conventions.
  Solution: If you do not use the backup script to perform the full backup, make sure that the names of the backup files follow the Database name_Backup type_Backup time.bak format. Example: Testdb_FULL_20180518153544.bak.
• Error message: HTTP Status: 403 Error:Forbidden.RAM The user is not authorized to operate the specified resource, or this operation does not support RAM. RequestID: xxxxx {'status': 403, 'request-id': 'xxxx', 'details': {'HostId': 'atp-test-on-ecs.oss-cn-beijing.aliyuncs.com', 'Message': 'The bucket you visit is not belong to you.', 'Code': 'AccessDenied', 'RequestId': 'xxxx'}}
  Cause: The RAM user does not have the required permissions.
  Solution: Attach the AliyunOSSFullAccess and AliyunRDSFullAccess policies to the RAM user. For more information about how to authorize a RAM user, see Authorize a RAM user.
• Error message: OPENAPI Response Error !!!!! : HTTP Status: <Http Status Code> Error:<Error> <Description>. RequestID: 32BB6886-775E-4BB7-A054-635664****
  Cause: An error occurs when an API operation is called.
  Solution: Analyze the specific error cause based on the error codes that are described in Table 2.
Table 2. Error codes
HTTP status code Error code Error message Description
403 InvalidDBName The specified database name is not allowed. The error message returned because the specified database names are invalid. For example, if the name of a database is the same as the name of a system database, the name of the database is invalid.
403 IncorrectDBInstanceState Current DB instance state does not support this operation. The error message returned because the RDS instance is not in a required state. For example, the RDS instance is in the Creating state.
400 IncorrectDBInstanceType Current DB instance type does not support this operation. The error message returned because the RDS instance does not run SQL Server.
400 IncorrectDBInstanceLockMode Current DB instance lock mode does not support this operation. The error message returned because the RDS instance is in a locking state that does not support the operation.
400 InvalidDBName.NotFound Specified one or more DB name does not exist or DB status does not support. The error message returned because the specified databases cannot be found.
  • SQL Server 2008 R2: Create databases on the RDS instance before the data migration. Make sure that each database whose data you want to migrate from the self-managed instance has a counterpart with an identical name on the RDS instance.
  • SQL Server 2012 and later: Make sure that each database whose data you want to migrate from the self-managed instance does not have a counterpart with an identical name on the RDS instance.
400 IncorrectDBType Current DB type does not support this operation. The error message returned because the operation is not supported by the database engine that is run on the RDS instance.
400 IncorrectDBState Current DB state does not support this operation. The error message returned because the databases are being created or receiving data from another migration task.
400 UploadLimitExceeded UploadTimesQuotaExceeded: Exceeding the daily upload times of this DB. The error message returned because the number of data migration tasks that are performed on a single database on the day exceeds 20.
400 ConcurrentTaskExceeded Concurrent task exceeding the allowed amount. The error message returned because the number of concurrent data migration tasks exceeds 500.
400 IncorrectFileExtension The file extension does not support. The error message returned because the file name extensions of the backup files are invalid.
400 InvalidOssUrl Specified oss url is not valid. The error message returned because the specified URL to download backup files from the OSS bucket is invalid.
400 BakFileSizeExceeded Exceeding the allowed bak file size. The error message returned because the total size of the backup files exceeds 3 TB.
400 FileSizeExceeded Exceeding the allowed file size of DB instance. The error message returned because the size of the data restored from the backup files exceeds the available storage of the RDS instance.

Related operations

Create a migration task: Creates a migration task.
Open the database to which backup data is migrated: Opens a database.
Query backup data migration tasks: Queries the migration tasks.
Query the backup file details of a backup data migration task: Queries the details about the backup data files that are uploaded to an OSS bucket.