
Data Online Migration: Migrate data

Last Updated: Jan 08, 2026

This topic describes the usage notes, limits, and procedure for migrating data between local file systems.

Usage notes

When you migrate data by using Data Online Migration, take note of the following items:

  • When you create source and destination data addresses, you must set the value of the Directory To Be Migrated parameter to an absolute path. The path must start and end with a forward slash (/) and cannot contain environment variables or special characters.

  • When you create source and destination data addresses, make sure that you set the Directory To Be Migrated parameter to a path that exists and is valid.

  • A migration task occupies the resources at the source and destination data addresses. To ensure business continuity, we recommend that you enable throttling for your migration task or run your migration task during off-peak hours.

  • Before a migration task starts, Data Online Migration checks the files at the source and destination data addresses. If a file at the source data address and a file at the destination data address have the same name, and the Overwrite Mode parameter of the migration task is set to Overwrite All or Overwrite based on the last modification time, the file at the destination data address is overwritten during migration. If the two files contain different information and the file at the destination data address needs to be retained, we recommend that you change the name of one file or back up the file at the destination data address.
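
The directory format rules above can be checked locally before you configure a data address. The following is a minimal Python sketch of such a check; the helper name is illustrative and the allowed character set is an assumption that you may need to adjust to your own naming rules.

    import re

    def is_valid_migration_dir(path: str) -> bool:
        # Hypothetical local check for the Directory To Be Migrated value:
        # the path must be absolute, must start and end with a forward slash (/),
        # and must not contain environment variables or special characters.
        if not (path.startswith("/") and path.endswith("/")):
            return False
        if "$" in path:          # reject environment variables such as $HOME or ${HOME}
            return False
        # Conservative character set (assumption); adjust to your own naming rules.
        return re.fullmatch(r"[A-Za-z0-9._\-/]+", path) is not None

    print(is_valid_migration_dir("/example/src/"))   # True
    print(is_valid_migration_dir("example/src"))     # False: not absolute, no trailing slash
    print(is_valid_migration_dir("/data/$HOME/"))    # False: contains an environment variable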

Limits

  • Character device files, block device files, socket files, and pipe files (FIFOs) at the source data address cannot be migrated.

  • After a hard link at the source data address is migrated to the destination data address, the hard link becomes a regular file.

  • If you enable directory migration, all the directories at the source data address are migrated to the destination data address. In this case, the specified overwrite mode of the migration task is ignored.

  • File permissions such as SUID, SGID, and the sticky bit cannot be migrated.

  • Only specific attributes of data can be migrated between local file systems.

    • Attributes that can be migrated are ModifyTime, Permissions, Uid, and Gid.

      Note
      • The permissions are the nine standard permission bits: read, write, and execute for the owner, the group, and other users.

      • Uid indicates the user ID. Gid indicates the ID of the group to which the user belongs.

    • Attributes that cannot be migrated include but are not limited to AccessTime, ChangeTime, Attr, and Acl.

      Note

      There is no guarantee that other attributes can be migrated. The actual migration results prevail.
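
Because only ModifyTime, Permissions, Uid, and Gid are carried over, you can spot-check these attributes on a migrated file. The following is a minimal Python sketch of such a check; the file paths are placeholders, not values used by Data Online Migration.

    import os

    def migratable_attrs(path: str) -> dict:
        # Collect the attributes that can be migrated: last modification time,
        # the nine rwx permission bits (SUID, SGID, and the sticky bit are not migrated),
        # the user ID, and the group ID.
        st = os.stat(path, follow_symlinks=False)
        return {
            "ModifyTime": int(st.st_mtime),
            "Permissions": st.st_mode & 0o777,
            "Uid": st.st_uid,
            "Gid": st.st_gid,
        }

    # Placeholder paths: replace with a source file and its migrated copy.
    src = migratable_attrs("/example/src/example.jpg")
    dst = migratable_attrs("/example/dest/example.jpg")
    print("attributes match:", src == dst)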

Step 1: Select a region

  1. Log on to the Data Migration console as the Resource Access Management (RAM) user that you created for data migration.

  2. In the upper-left corner of the top navigation bar, select the region in which the agent that you want to use resides.


    Important
    • The tunnels, agents, data addresses, and migration tasks that you create in a region cannot be used in another region. Select a region with caution.

    • We recommend that you select the region in which the agent resides. If the region in which the agent resides is not supported by Data Online Migration, select the region that is closest to the region in which the agent resides.

Step 2: Create a tunnel

  1. In the left-side navigation pane, choose Data Online Migration > Tunnel Management. On the Tunnel Management page, click Create Tunnel.

  2. In the Create Tunnel dialog box, configure the parameters and click OK. The following table describes the parameters.

    Parameter

    Required

    Description

    Name

    Yes

    The name of the tunnel.

    • The name cannot be empty and can be up to 100 characters in length.

    • The name can contain letters, digits, hyphens (-), and underscores (_).

    Maximum Bandwidth

    Yes

    The maximum bandwidth that the tunnel can use.

    • If you do not configure this parameter, the default value 0 is used, which indicates that the bandwidth for the tunnel is not limited.

    • If you configure this parameter, enter a value based on the note in the console.

    Important

    The bandwidth that is available for the tunnel depends on the actual bandwidth of the network connection.

    Requests/s

    Yes

    The maximum number of requests per second over the tunnel.

    • If you do not configure this parameter, the default value 0 is used, which indicates that the number of requests per second over the tunnel is not limited.

    • If you configure this parameter, enter a value based on the note in the console.

    Warning

    We recommend that you evaluate the capabilities of the storage system at the data source before you configure this parameter. If you set this parameter to an excessively large value, your business may be affected. We recommend that you enter a value based on the note in the console.

Note

For more information about tunnels, see Manage tunnels.

Step 3: Create an agent

  1. In the left-side navigation pane, choose Data Online Migration > Agent Management. On the Agent Management page, click New Agent.

  2. In the New Agent dialog box, configure the parameters and click OK. The following table describes the parameters.

    Parameter

    Required

    Description

    Name

    Yes

    The name of the agent.

    • The name cannot be empty and must be 3 to 63 characters in length.

      • It can contain lowercase English letters, digits, hyphens (-), and underscores (_). The name is case-sensitive.

      • The name must be in UTF-8 encoding and cannot start with a hyphen (-) or an underscore (_).

    Network Type

    Yes

    The method that the agent uses to connect to the Data Online Migration service. The following two types are supported:

    • VPC (recommended): The agent connects to the Data Online Migration service through a VPC. This method requires the machine where the agent is deployed to be able to access the internal same-region endpoint of the Data Online Migration service in the corresponding region. For example, if you use the migration service in the China (Beijing) region, the agent machine must be able to access the internal same-region endpoint {TunnelId}.cn-beijing.mgw-tc-internal.aliyuncs.com. Use an ECS instance in the same region as the Data Online Migration console to deploy the agent.

    • Internet: The agent connects to the Data Online Migration service over the Internet. This method requires the machine where the agent is deployed to be able to access the public endpoint of the Data Online Migration service in the corresponding region. For example, if you use the migration service in the China (Beijing) region, the agent machine must be able to access the public endpoint {TunnelId}.cn-beijing.mgw-tc.aliyuncs.com.

    Note
    • TunnelId indicates the tunnel ID.

    • You can use the ping command to test the network connectivity between the agent and the migration service.
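
    In addition to ping, you can test whether the agent machine can open a TCP connection to the endpoint for the selected network type. The following is a minimal Python sketch; the tunnel ID in the hostname is a placeholder and port 443 is an assumption, so use the endpoint and port shown in the console or in the agent deployment command.

        import socket

        # Placeholder: replace with the {TunnelId} endpoint shown in the console for your region.
        endpoint = "exampletunnelid.cn-beijing.mgw-tc.aliyuncs.com"
        port = 443  # assumption; use the port from the deployment command if it differs

        try:
            with socket.create_connection((endpoint, port), timeout=5):
                print(f"TCP connection to {endpoint}:{port} succeeded")
        except OSError as err:
            print(f"Cannot reach {endpoint}:{port}: {err}")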

    Deployment Method

    Yes

    The deployment method of the agent. Currently, only the Independent process mode is supported.

    Tunnel

    Yes

    The tunnel to which the agent belongs. An agent can be associated with only one tunnel. The bandwidth of the agent is affected by the total bandwidth of the tunnel.

    For example, a tunnel named tunnel-1 has a maximum bandwidth of 10 Gbit/s. tunnel-1 is associated with three agents: agent-1, agent-2, and agent-3. The total bandwidth of the three agents cannot exceed 10 Gbit/s. If you set the bandwidth of agent-1 to 3 Gbit/s, only 7 Gbit/s of bandwidth is available for agent-2 and agent-3. Plan and allocate bandwidth carefully.

  3. Generate the command used to deploy the agent. For more information, see the "Generate the command to deploy an agent" section of the Manage agents topic.

Note

For more information about agents, see Manage agents.

Step 4: Create a source data address

  1. In the left-side navigation pane, choose Data Online Migration > Address Management. On the Address Management page, click Create Address.

  2. In the Create Address panel, configure the parameters and click OK. The following table describes the parameters.

    Parameter

    Required

    Description

    Name

    Yes

    The name of the source data address. The name must meet the following requirements:

    • The name is 3 to 63 characters in length.

    • The name is case-sensitive and can contain lowercase letters, digits, hyphens (-), and underscores (_).

    • The name is encoded in the UTF-8 format and cannot start with a hyphen (-) or an underscore (_).

    Type

    Yes

    The type of the source data address. Select LocalFS.

    Directory To Be Migrated

    Yes

    The directory to be migrated. Enter the path of the directory in the field. You must enter an absolute path. The path must start and end with a forward slash (/) and cannot contain environment variables or special characters.

    For example, you set the prefix of the source data address to /example/src/, store a file named example.jpg in /example/src/, and set the prefix of the destination data address to /example/dest/. After the example.jpg file is migrated to the destination data address, the full path of the file is /example/dest/example.jpg.

    Important

    If a data address is associated with multiple agents, you must make sure that each agent can access the directory. Otherwise, some data may fail to be migrated.
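
    The /example/src/ to /example/dest/ example above maps every source file onto the destination by replacing the directory prefix. The following is a minimal Python sketch of that mapping; the helper name is illustrative and is not part of the service.

        def map_to_destination(src_path: str, src_prefix: str, dest_prefix: str) -> str:
            # Illustrative prefix mapping: /example/src/example.jpg -> /example/dest/example.jpg
            if not src_path.startswith(src_prefix):
                raise ValueError(f"{src_path} is not under the source prefix {src_prefix}")
            return dest_prefix + src_path[len(src_prefix):]

        print(map_to_destination("/example/src/example.jpg", "/example/src/", "/example/dest/"))
        # /example/dest/example.jpg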

    Tunnel

    Yes

    The name of the tunnel that you want to use.

    Important
    • This parameter is required only when you migrate data to the cloud by using Express Connect circuits or VPN gateways or migrate data from self-managed databases to the cloud.

    • If data at the destination data address is stored in a local file system or you need to migrate data over an Express Connect circuit in an environment such as Alibaba Finance Cloud or Apsara Stack, you must create and deploy an agent.

    Agent

    Yes

    The name of the agent that you want to use.

    Important
    • This parameter is required only when you migrate data to the cloud by using Express Connect circuits or VPN gateways or migrate data from self-managed databases to the cloud.

    • You can select up to 200 agents at a time for a specific tunnel.

Step 5: Create a destination data address

  1. In the left-side navigation pane, choose Data Online Migration > Address Management. On the Address Management page, click Create Address.

  2. In the Create Address panel, configure the parameters and click OK. The following table describes the parameters.

    Parameter

    Required

    Description

    Name

    Yes

    The name of the destination data address. The name must meet the following requirements:

    • The name is 3 to 63 characters in length.

    • The name is case-sensitive and can contain lowercase letters, digits, hyphens (-), and underscores (_).

    • The name is encoded in the UTF-8 format and cannot start with a hyphen (-) or an underscore (_).

    Type

    Yes

    The type of the destination data address. Select LocalFS.

    Directory To Be Migrated

    Yes

    The prefix of the destination data address. You can specify a prefix to migrate specific data.

    You must enter an absolute path. The path must start and end with a forward slash (/) and cannot contain environment variables or special characters.

    For example, you set the prefix of the source data address to /example/src/, store a file named example.jpg in /example/src/, and set the prefix of the destination data address to /example/dest/. After the example.jpg file is migrated to the destination data address, the full path of the file is /example/dest/example.jpg.

    Important

    If a data address is associated with multiple agents, you must make sure that each agent can access the directory. Otherwise, some data may fail to be migrated.

    Tunnel

    Yes

    The name of the tunnel that you want to use.

    Important
    • This parameter is required only when you migrate data to the cloud by using Express Connect circuits or VPN gateways or migrate data from self-managed databases to the cloud.

    • If data at the destination data address is stored in a local file system or you need to migrate data over an Express Connect circuit in an environment such as Alibaba Finance Cloud or Apsara Stack, you must create and deploy an agent.

    Agent

    Yes

    The name of the agent that you want to use.

    Important
    • This parameter is required only when you migrate data to the cloud by using Express Connect circuits or VPN gateways or migrate data from self-managed databases to the cloud.

    • You can select up to 200 agents at a time for a specific tunnel.

Step 6: Create a migration task

  1. In the left-side navigation pane, choose Data Online Migration > Migration Tasks. On the Migration Tasks page, click Create Task.

  2. In the Select Address step, configure the parameters. The following table describes the parameters.

    Parameter

    Required

    Description

    Name

    Yes

    The name of the migration task. The name must meet the following requirements:

    • The name is 3 to 63 characters in length.

    • The name is case-sensitive and can contain lowercase letters, digits, hyphens (-), and underscores (_).

    • The name is encoded in the UTF-8 format and cannot start with a hyphen (-) or an underscore (_).

    Source Address

    Yes

    The source data address that you created.

    Destination Address

    Yes

    The destination data address that you created.

  3. In the Task Configurations step, configure the parameters that are described in the following table.

    Parameter

    Required

    Description

    Migration Bandwidth

    No

    The maximum bandwidth that is available to the migration task. Valid values:

    • Default: Use the default upper limit for the migration bandwidth. The actual migration bandwidth depends on the file size and the number of files.

    • Specify an upper limit: Specify a custom upper limit for the migration bandwidth as prompted.

    Important
    • The actual migration speed depends on multiple factors, such as the source data address, network, throttling at the destination data address, and file size. Therefore, the actual migration speed may not reach the specified upper limit.

    • Specify a reasonable value for the upper limit of the migration bandwidth based on the evaluation of the source data address, migration purpose, business situation, and network bandwidth. Inappropriate throttling may affect business performance.

    Files Migrated Per Second

    No

    The maximum number of files that can be migrated per second. Valid values:

    • Default: Use the default upper limit for the number of files that can be migrated per second.

    • Specify an upper limit: Specify a custom upper limit as prompted for the number of files that can be migrated per second.

    Important
    • The actual migration speed depends on multiple factors, such as the source data address, network, throttling at the destination data address, and file size. Therefore, the actual migration speed may not reach the specified upper limit.

    • Specify a reasonable value for the upper limit of the number of files that can be migrated per second based on the evaluation of the source data address, migration purpose, business situation, and network bandwidth. Inappropriate throttling may affect business performance.

    Overwrite Mode

    No

    Specifies whether to overwrite a file at the destination data address if the file has the same name as a file at the source data address. Valid values:

    • Do not overwrite: does not migrate the file at the source data address.

    • Overwrite All: overwrites the file at the destination data address.

    • Overwrite based on the last modification time:

      • If the last modification time of the file at the source data address is later than that of the file at the destination data address, the file at the destination data address is overwritten.

      • If the last modification time of the file at the source data address is the same as that of the file at the destination data address, the file at the destination data address is overwritten if the files differ in size or in the Content-Type header.

    Warning
      • If you select Overwrite based on the last modification time, there is no guarantee that newer files will not be overwritten by older ones, which creates a risk of losing recent updates.

      • If you select Overwrite based on the last modification time, make sure that the file at the source data address contains information such as the last modification time, size, and Content-Type header. Otherwise, the overwrite policy may become invalid and unexpected migration results may occur.

      • If you select Do not overwrite or Overwrite based on the last modification time, the system sends a request to the source and destination data addresses to obtain the meta information and determines whether to overwrite a file. Therefore, request fees are generated for the source and destination data addresses.
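
    The three overwrite modes can be summarized as a small decision rule for same-named files. The following Python sketch restates the documented behavior; it is an illustration, not the service's implementation, and the parameter names are placeholders.

        def should_overwrite(mode, src_mtime, dst_mtime, src_size, dst_size,
                             src_content_type=None, dst_content_type=None):
            # Illustrative restatement of the documented Overwrite Mode rules.
            if mode == "Do not overwrite":
                return False
            if mode == "Overwrite All":
                return True
            if mode == "Overwrite based on the last modification time":
                if src_mtime > dst_mtime:
                    return True
                if src_mtime == dst_mtime:
                    # Overwrite if the files differ in size or in the Content-Type header.
                    return src_size != dst_size or src_content_type != dst_content_type
                return False
            raise ValueError(f"unknown mode: {mode}")

        print(should_overwrite("Overwrite based on the last modification time",
                               src_mtime=100, dst_mtime=100, src_size=10, dst_size=12))  # True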

    Migration Report

    Yes

    Specifies whether to push a migration report.

    • Do not push: does not push the migration report to the destination local file system. This is the default value.

    • Push: pushes the migration report to the destination file system. For more information, see What to do next.

    Important
    • The migration report occupies storage space at the destination data address.

    • The migration report may be pushed with a delay. Wait until the migration report is generated.

    • A unique ID is generated for each execution of a task. A migration report is pushed only once. Exercise caution when you delete the migration report.

    Migration Logs

    Yes

    Specifies whether to push migration logs to Simple Log Service (SLS). Valid values:

    • Do not push (default): Does not push migration logs.

    • Push: Pushes migration logs to SLS. View the migration logs in the SLS console.

    • Push only file error logs: Pushes only error migration logs to SLS. View the error migration logs in the SLS console.

    If you select Push or Push only file error logs, Data Online Migration creates a project in SLS. The name of the project is in the following format: aliyun-oss-import-log-{Alibaba Cloud account ID}-{Region of the Data Online Migration console}. Example: aliyun-oss-import-log-137918634953****-cn-hangzhou.

    Important

    To prevent errors in the migration task, make sure that the following requirements are met before you select Push or Push only file error logs:

    • SLS is activated.

    • You have confirmed the authorization on the Authorize page.

    Authorize

    No

    This parameter is displayed if you set the Migration Logs parameter to Push or Push only file error logs.

    Click Authorize to go to the Cloud Resource Access Authorization page. On this page, click Confirm Authorization Policy. The RAM role AliyunOSSImportSlsAuditRole is created and permissions are granted to the RAM role.

    File Name

    No

    The filter based on the file name.

    Both inclusion and exclusion rules are supported. However, only the syntax of specific regular expressions is supported. For more information about the syntax of regular expressions, visit re2. Example:

    • .*\.jpg$ indicates all files whose names end with .jpg.

    • If no prefix is configured for the source data address, ^file.* indicates all files whose names start with file in the root directory.

      If a prefix is configured for the source data address and the prefix is data/to/oss/, you need to use the ^data/to/oss/file.* filter to match all files whose names start with file in the specified directory.

    • .*/picture/.* indicates files whose paths contain a subdirectory called picture.

    Important
    • If an inclusion rule is configured, all files that meet the inclusion rule are migrated. If multiple inclusion rules are configured, files are migrated as long as one of the inclusion rules is met.

      For example, the picture.jpg and picture.png files exist and the inclusion rule .*\.jpg$ is configured. Only the picture.jpg file is migrated. If the inclusion rule .*\.png$ is configured at the same time, both files are migrated.

    • If an exclusion rule is configured, all files that meet the exclusion rule are not migrated. If multiple exclusion rules are configured, files are not migrated as long as one of the exclusion rules is met.

      For example, the picture.jpg and picture.png files exist and the exclusion rule .*\.jpg$ is configured. Only the picture.png file is migrated. If the exclusion rule .*\.png$ is configured at the same time, neither file is migrated.

    • Exclusion rules take precedence over inclusion rules. If a file meets both an exclusion rule and an inclusion rule, the file is not migrated.

      For example, the file.txt file exists, and the exclusion rule .*\.txt$ and the inclusion rule file.* are configured. In this case, the file is not migrated.
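
    The interaction of inclusion and exclusion rules described above can be expressed as a short filter. The following Python sketch demonstrates the matching behavior on simple patterns; note that the console expects RE2 syntax, while Python's re module is used here only for illustration.

        import re

        include_rules = [r".*\.jpg$", r".*\.png$"]   # migrate files that match any inclusion rule
        exclude_rules = [r".*/temp/.*"]              # skip files that match any exclusion rule

        def is_migrated(path: str) -> bool:
            # Exclusion rules take precedence over inclusion rules.
            if any(re.fullmatch(rule, path) for rule in exclude_rules):
                return False
            if include_rules:
                return any(re.fullmatch(rule, path) for rule in include_rules)
            return True  # no inclusion rules configured: migrate everything that is not excluded

        print(is_migrated("photos/picture.jpg"))       # True
        print(is_migrated("photos/temp/picture.jpg"))  # False: matches an exclusion rule
        print(is_migrated("notes/file.txt"))           # False: matches no inclusion rule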

    File Modification Time

    No

    The filter based on the last modification time of files.

    You can specify the last modification time as a filter rule. If you specify a time period, only the files whose last modification time is within the specified time period are migrated. Examples:

    • If you specify January 1, 2019 as the start time and do not specify the end time, only the files whose last modification time is not earlier than January 1, 2019 are migrated.

    • If you specify January 1, 2022 as the end time and do not specify the start time, only the files whose last modification time is not later than January 1, 2022 are migrated.

    • If you specify January 1, 2019 as the start time and January 1, 2022 as the end time, only the files whose last modification time is not earlier than January 1, 2019 and not later than January 1, 2022 are migrated.
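
    The time-window behavior described above can also be reproduced locally to estimate which files fall inside the filter. The following Python sketch checks a single file; the window boundaries and the path are placeholders.

        import os
        from datetime import datetime, timezone

        # Placeholder window: only files modified between these instants would be migrated.
        start = datetime(2019, 1, 1, tzinfo=timezone.utc).timestamp()
        end = datetime(2022, 1, 1, tzinfo=timezone.utc).timestamp()

        def in_migration_window(path: str) -> bool:
            # True if the file's last modification time falls inside [start, end].
            return start <= os.stat(path).st_mtime <= end

        print(in_migration_window("/example/src/example.jpg"))  # placeholder path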

    Whether to Migrate

    No

    Specifies whether to migrate special entities. If you select a check box, the corresponding special entities are migrated. If you clear a check box, the corresponding special entities are not migrated.

    Directory

    • If you select the check box, all the directories scanned at the source data address are to be migrated. In addition, the statistics about the directories are included in the values of the Files and Volume of Stored Data parameters for the migration task. Corresponding directories are created at the destination data address and configured with the attributes of the source directories. Only attributes that support migration are configured for the directories.

    • If you clear the check box, all the directories at the source data address are ignored. In addition, the statistics about directories are excluded from the values of the Files and Volume of Stored Data parameters for the migration task. For a directory scanned at the source data address:

      • If the directory does not contain files or all files in the directory are filtered out by the specified filter conditions, no corresponding directory is created at the destination data address.

      • If the directory contains files to be migrated, a corresponding directory is created at the destination data address to store the migrated files and is configured with default attribute values. The attributes of the source directory that support migration are not configured for the destination directory.

    symlink

    • If you select the check box, all the symbolic links at the source data address are to be migrated. In addition, the statistics about symbolic links are included in the values of the Files and Volume of Stored Data parameters for the migration task. Corresponding symbolic links are created at the destination data address and configured with the attributes of the source symbolic links. Only attributes that support migration are configured for the symbolic links. The value of the Target attribute of a destination symbolic link depends on the value of the Convert Destination Path Or Not parameter.

    • If you clear the check box, all the symbolic links at the source data address are ignored. In addition, the statistics about symbolic links are excluded from the values of the Files and Volume of Stored Data parameters for the migration task.

    Important

    Whether you migrate or ignore the symbolic links at the source data address, the files and directories to which the symbolic links point are not migrated unless the files and directories are already included in the data to be migrated.

    Convert Destination Path Or Not

    No

    Specifies whether to convert the Target attribute value of the symbolic links at the source data address to ensure that the symbolic links at the destination data address can point to their objects as expected. If you select the check box, the Target attribute value is converted. If you clear the check box, the Target attribute value is not converted.

    Important
    • This parameter is available only if you select symlink for the Whether to Migrate parameter.

    • Regardless of whether the Target attribute value is converted or not, the system does not check whether the object to which a symbolic link points exists, whether the type of the object is valid, or whether the object can be accessed.

    If you enable this feature, the system checks the format of the Target attribute value of a source symbolic link.

    • If the value is a relative path, the value is not converted and is set as the Target attribute value of the corresponding destination symbolic link.

    • If the value is an absolute path, the value is first parsed into the shortest equivalent absolute path (AbsTarget) based on the source directory that stores the symbolic link. If AbsTarget contains the prefix of the source data address, the prefix is replaced with the prefix of the destination data address. The value after the replacement is set as the Target attribute value of the corresponding destination symbolic link.

    Note

    For example, the prefix of the source data address of a migration task is /mnt/nas1/, the prefix of the destination data address is /mnt/nas2/, and the symbolic link /mnt/nas1/links/a.lnk exists at the source data address. The following list describes the Target value of the corresponding destination symbolic link in different cases:

    • The Target attribute value of the source symbolic link is ../data/./a.txt. The value is not converted. The Target attribute value of the corresponding destination symbolic link is ../data/./a.txt.

    • The Target attribute value of the source symbolic link is /mnt/nas1/verbose/../data/./a.txt. The value is parsed into the shortest absolute path /mnt/nas1/data/a.txt. Then, /mnt/nas1/ in the shortest absolute path is replaced with /mnt/nas2/. Therefore, the Target attribute value of the corresponding destination symbolic link is /mnt/nas2/data/a.txt.

    • The Target attribute value of the source symbolic link is /root/outer/../data/./a.txt. The value is parsed into the shortest absolute path /root/data/a.txt. The shortest absolute path does not contain the prefix of the source data address. In this case, the Target attribute value of the corresponding destination symbolic link is /root/data/a.txt.

    If you disable this feature, Target attribute values are not converted. The Target attribute values of the source symbolic links are set as the Target attribute values of the corresponding destination symbolic links.
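
    The conversion rules above can be restated as a short function. The following Python sketch mirrors the documented examples; it is an illustration of the rule, not the conversion logic used by the service.

        import posixpath

        def convert_symlink_target(target: str, src_prefix: str, dest_prefix: str) -> str:
            # Relative targets are kept as-is. Absolute targets are reduced to the shortest
            # equivalent path; if that path falls under the source prefix, the prefix is
            # replaced with the destination prefix.
            if not target.startswith("/"):
                return target
            abs_target = posixpath.normpath(target)
            if abs_target.startswith(src_prefix):
                return dest_prefix + abs_target[len(src_prefix):]
            return abs_target

        src, dest = "/mnt/nas1/", "/mnt/nas2/"
        print(convert_symlink_target("../data/./a.txt", src, dest))                    # ../data/./a.txt
        print(convert_symlink_target("/mnt/nas1/verbose/../data/./a.txt", src, dest))  # /mnt/nas2/data/a.txt
        print(convert_symlink_target("/root/outer/../data/./a.txt", src, dest))        # /root/data/a.txt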

    Execution Time

    No

    Important
    1. If the current execution of a migration task is not complete by the next scheduled start time, the task starts its next execution at the subsequent scheduled start time after the current migration is complete. This process continues until the task is run the specified number of times.

    2. If Data Online Migration is deployed in the China (Hong Kong) region or the regions in the Chinese mainland, up to 10 concurrent migration tasks are supported. If Data Online Migration is deployed in regions outside China, up to five concurrent migration tasks are supported. If the number of concurrent tasks exceeds the limit, executions of tasks may not be complete as scheduled.

    The time when the migration task is run. Valid values:

    • Immediately: The task is immediately run.

    • Scheduled Task: The task is run within the specified time period every day. By default, the task is started at the specified start time and stopped at the specified stop time.

    • Periodic Scheduling: The task is run based on the execution frequency and number of execution times that you specify.

      • Execution Frequency: Specify the execution frequency of the task. Valid values: Every Hour, Every Day, Every Week, Certain Days of the Week, and Custom. For more information, see the Supported execution frequencies section of this topic.

      • Executions: Specify the maximum number of execution times of the task as prompted. By default, if you do not specify this parameter, the task is run once.

    Important

    You can manually start and stop tasks at any point in time. This is not affected by the custom execution time of tasks.

  4. Read and confirm the Data Online Migration Agreement. Then click Next.

  5. Verify that the configurations are correct and click OK. The migration task is created.

Supported execution frequencies

Execution frequency

Description

Example

Hourly

Runs the task every hour. You can combine this with the maximum number of executions.

The current time is 8:05. You set the frequency to hourly and the number of executions to 3. The first task starts at the next hour, 9:00.

  • If a task finishes before the next hour, the next task will start on the hour. This process repeats until the specified number of migrations is completed.

  • If the task does not finish by the next hour and ends at 12:30, the second task starts at the next hour, 13:00. This process repeats until the specified number of migrations are complete.

Daily

Runs the task daily at a specified hour (0–23). You can combine this with the maximum number of executions.

The current time is 8:05. You set the task to run daily at 10:00 for 5 executions. The first task starts at 10:00 today.

  • If the task finishes before 10:00 the next day, the second task starts at 10:00 the next day. This process repeats until the specified number of migrations are complete.

  • If the task does not finish by 10:00 the next day and ends at 12:05 the next day, the second task starts at 10:00 on the third day. This process repeats until the specified number of migrations are complete.

Weekly

Runs the task on a specific day of the week at a specified hour (0–23). You can combine this with the maximum number of executions.

The current time is Monday, 8:05. You set the task to run every Monday at 10:00 for 10 executions. The first task starts at 10:00 today.

  • If the task finishes before 10:00 next Monday, the second task starts at 10:00 next Monday. This process repeats until the specified number of migrations are complete.

  • If the task does not finish by 10:00 next Monday and ends at 12:05 next Monday, the second task starts at 10:00 on the following Monday. This process repeats until the specified number of migrations are complete.

Specific days of the week

Runs the task on specific days of the week at a specified hour (0–23).

The current time is Wednesday, 8:05. You set the task to run at 10:00 on Mondays, Wednesdays, and Fridays. The first task starts at 10:00 today.

  • If the task finishes before 10:00 on Friday, the second task starts at 10:00 on Friday. This process repeats until the specified number of migrations are complete.

  • If the task does not finish by 10:00 on Friday and ends at 12:05 next Monday, the second task starts at 10:00 next Wednesday. This process repeats until the specified number of migrations are complete.

Custom

Uses a cron expression to set a custom schedule for the task.

Note

A cron expression consists of 6 fields separated by spaces. The fields represent the execution schedule in the following order: second, minute, hour, day of the month, month, and day of the week.

The following are example cron expressions. For more information, see a cron expression generator.

  • 0 0 * * * *: Runs the task at 0 minutes and 0 seconds of every hour.

  • 0 0 0/1 * * ?: Runs the task every hour. The minimum interval is 1 hour.

  • 0 0 12 * * MON-FRI: Runs the task at 12:00 from Monday to Friday.

  • 0 30 8 1,15 * *: Runs the task at 8:30 on the 1st and 15th of each month.
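
The field order described in the Note above can be checked with a few lines of code. The following Python sketch only labels the six fields of a cron expression; it does not validate the values or evaluate the schedule.

    FIELDS = ("second", "minute", "hour", "day of month", "month", "day of week")

    def describe_cron(expr: str) -> dict:
        # Split a 6-field cron expression into its named fields.
        parts = expr.split()
        if len(parts) != 6:
            raise ValueError("a cron expression here consists of exactly 6 space-separated fields")
        return dict(zip(FIELDS, parts))

    print(describe_cron("0 30 8 1,15 * *"))
    # {'second': '0', 'minute': '30', 'hour': '8', 'day of month': '1,15', 'month': '*', 'day of week': '*'}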

Step 7: Verify data

Data Online Migration solely handles the migration of data and does not ensure data consistency or integrity. After a migration task is complete, you must review all the migrated data and verify the data consistency between the source and destination data addresses.

Warning

Make sure that you verify the migrated data at the destination data address after a migration task is complete. If you delete the data at the source data address before you verify the migrated data at the destination data address, you are liable for the losses and consequences caused by any data loss.
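
One way to perform such a verification between local file systems is to walk the source directory, confirm that each file exists at the destination, and compare sizes and checksums. The following is a minimal Python sketch; the directory prefixes are placeholders, and the check is an example rather than a feature of Data Online Migration.

    import hashlib
    import os

    def md5sum(path: str) -> str:
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1024 * 1024), b""):
                h.update(chunk)
        return h.hexdigest()

    def verify(src_prefix: str, dest_prefix: str) -> list:
        # Return the relative paths of files that are missing or differ at the destination.
        problems = []
        for root, _dirs, files in os.walk(src_prefix):
            for name in files:
                src_file = os.path.join(root, name)
                rel = os.path.relpath(src_file, src_prefix)
                dest_file = os.path.join(dest_prefix, rel)
                if not os.path.isfile(dest_file):
                    problems.append(rel + " (missing)")
                elif (os.path.getsize(src_file) != os.path.getsize(dest_file)
                      or md5sum(src_file) != md5sum(dest_file)):
                    problems.append(rel + " (differs)")
        return problems

    # Placeholder prefixes: replace with the configured source and destination directories.
    print(verify("/example/src/", "/example/dest/"))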