This topic describes the parameters that you need to set when you use Data Online Migration to migrate data.
Alibaba Cloud OSS
- OSS Endpoint
The OSS endpoint. The following table describes the valid formats of OSS endpoints. For more information about OSS endpoints, see Regions and endpoints.

| No. | Format | Description |
| --- | --- | --- |
| 1 | http://oss-cn-hangzhou.aliyuncs.com | You can use this type of endpoint to upload or download data over the Internet by using HTTP. |
| 2 | http://oss-cn-hangzhou-internal.aliyuncs.com | You can use this type of endpoint to upload or download data over the internal network by using HTTP. |
| 3 | https://oss-cn-hangzhou.aliyuncs.com | You can use this type of endpoint to upload or download data over the Internet by using HTTPS. |
| 4 | https://oss-cn-hangzhou-internal.aliyuncs.com | You can use this type of endpoint to upload or download data over the internal network by using HTTPS. |
- OSS Bucket
The name of the OSS bucket. Make sure that the bucket name does not start or end with invisible characters such as spaces, line feeds, or tab characters.
- OSS Prefix
The directory in which the files that you want to migrate are located. The value cannot be a file name. The value cannot start with a forward slash (/) and must end with a forward slash (/). For example, the value can be docs/.
- Access Key Id and Access Key Secret
The AccessKey pair that is used to access the OSS bucket. An AccessKey pair consists of an AccessKey ID and an AccessKey secret. You can use the AccessKey pair of an Alibaba Cloud account or a RAM user. You cannot use the AccessKey pair of a temporary user. For more information, see Resource access management. You cannot grant permissions on bills to RAM users.
If you want to use a RAM user to migrate data from OSS to another Alibaba Cloud service, or to migrate data to OSS, you must attach a policy that grants the RAM user the required permissions on the source or destination bucket. You can create a RAM user and use the AccessKey pair of the RAM user for a data migration job. After the migration job is complete, you can delete the AccessKey pair.
- Data Size and File Count
The amount of data and the number of files that you want to migrate. Log on to the OSS console to view the amount of data and the number of files in the specified bucket or directory.
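If you prefer to calculate these values programmatically rather than in the console, the following sketch uses the oss2 SDK for Python to total the size and number of objects under a prefix. The endpoint, AccessKey pair, bucket name, and prefix are placeholders that you must replace with your own values.

```python
# Sketch: total the size and number of objects under a prefix with the oss2 SDK.
# The endpoint, credentials, bucket name, and prefix below are placeholders.
import oss2

auth = oss2.Auth('<your-access-key-id>', '<your-access-key-secret>')
bucket = oss2.Bucket(auth, 'https://oss-cn-hangzhou.aliyuncs.com', '<your-bucket-name>')

total_size = 0
total_count = 0
for obj in oss2.ObjectIterator(bucket, prefix='docs/'):
    total_size += obj.size          # object size in bytes
    total_count += 1

print('Data Size: {:.2f} GB'.format(total_size / 1024 ** 3))
print('File Count: {}'.format(total_count))
```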
AWS S3
- Endpoint
Data Online Migration supports AWS S3 buckets that are deployed in the following AWS S3 regions. The endpoint that you specify must be in one of the following formats. Both HTTP and HTTPS are supported.

| Region | Region ID | Endpoint |
| --- | --- | --- |
| US East (Ohio) | us-east-2 | http://s3.us-east-2.amazonaws.com or http://s3-us-east-2.amazonaws.com |
| US East (N. Virginia) | us-east-1 | http://s3.us-east-1.amazonaws.com or http://s3-us-east-1.amazonaws.com |
| US West (N. California) | us-west-1 | http://s3.us-west-1.amazonaws.com or http://s3-us-west-1.amazonaws.com |
| US West (Oregon) | us-west-2 | http://s3.us-west-2.amazonaws.com or http://s3-us-west-2.amazonaws.com |
| Canada (Central) | ca-central-1 | http://s3.ca-central-1.amazonaws.com or http://s3-ca-central-1.amazonaws.com |
| Asia Pacific (Seoul) | ap-northeast-2 | http://s3.ap-northeast-2.amazonaws.com or http://s3-ap-northeast-2.amazonaws.com |
| Asia Pacific (Osaka-Local) | ap-northeast-3 | http://s3.ap-northeast-3.amazonaws.com |
| Asia Pacific (Singapore) | ap-southeast-1 | http://s3.ap-southeast-1.amazonaws.com or http://s3-ap-southeast-1.amazonaws.com |
| Asia Pacific (Sydney) | ap-southeast-2 | http://s3.ap-southeast-2.amazonaws.com or http://s3-ap-southeast-2.amazonaws.com |
| Asia Pacific (Tokyo) | ap-northeast-1 | http://s3.ap-northeast-1.amazonaws.com or http://s3-ap-northeast-1.amazonaws.com |
| China (Beijing) | cn-north-1 | http://s3.cn-north-1.amazonaws.com.cn |
| China (Ningxia) | cn-northwest-1 | http://s3.cn-northwest-1.amazonaws.com.cn |
| Europe (Frankfurt) | eu-central-1 | http://s3.eu-central-1.amazonaws.com or http://s3-eu-central-1.amazonaws.com |
| Europe (Ireland) | eu-west-1 | http://s3.eu-west-1.amazonaws.com or http://s3-eu-west-1.amazonaws.com |
| Europe (London) | eu-west-2 | http://s3.eu-west-2.amazonaws.com or http://s3-eu-west-2.amazonaws.com |
| Europe (Paris) | eu-west-3 | http://s3.eu-west-3.amazonaws.com or http://s3-eu-west-3.amazonaws.com |
| South America (São Paulo) | sa-east-1 | http://s3.sa-east-1.amazonaws.com or http://s3-sa-east-1.amazonaws.com |
- Bucket
The name of the AWS S3 bucket. Make sure that the bucket name does not start or end with invisible characters such as spaces, line feeds, or tab characters.
- Prefix
The directory in which the files that you want to migrate are located. The value cannot be a file name. The value cannot start with a forward slash (/) and must end with a forward slash (/). For example, the value can be docs/.
- Access Key Id and Secret Access Key
The access key that is used to access the AWS S3 bucket. An access key consists of an access key ID and a secret access key. Log on to the AWS Identity and Access Management (IAM) console, create an IAM user, attach the AmazonS3ReadOnlyAccess policy to grant the required permissions to the IAM user, and then create an access key for the IAM user. After the migration job is complete, you can delete the IAM user.
- Data Size and File Count
The amount of data and the number of files that you want to migrate. You must set the Data Size and File Count parameters based on the actual amount of data and the actual number of files that you want to migrate.
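If you want to calculate these values for a specific prefix, the following sketch uses the boto3 SDK for Python to total the size and number of objects under a prefix. The region, credentials, bucket name, and prefix are placeholders.

```python
# Sketch: sum the size and number of objects under an S3 prefix with boto3.
# The credentials, region, bucket name, and prefix below are placeholders.
import boto3

s3 = boto3.client(
    's3',
    region_name='us-east-1',
    aws_access_key_id='<your-access-key-id>',
    aws_secret_access_key='<your-secret-access-key>',
)

total_size = 0
total_count = 0
paginator = s3.get_paginator('list_objects_v2')
for page in paginator.paginate(Bucket='<your-bucket-name>', Prefix='docs/'):
    for obj in page.get('Contents', []):
        total_size += obj['Size']   # object size in bytes
        total_count += 1

print('Data Size: {:.2f} GB'.format(total_size / 1024 ** 3))
print('File Count: {}'.format(total_count))
```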
Azure Blob
- Storage Account, Key, and Connection Strings
Log on to the Microsoft Azure console. In the left-side navigation pane, click Storage accounts and click the storage account that you want to use. In the Settings section, click Access keys to view the information about the storage account, access key, and connection strings.
- Container
The name of the Azure Blob container.
- Prefix
The directory in which the files that you want to migrate are located. The value cannot be a file name. The value cannot start with a forward slash (/) and must end with a forward slash (/). For example, the value can be docs/.
- Data Size and File Count
The amount of data and the number of files that you want to migrate. To obtain these values, log on to the Microsoft Azure console, open the properties page of the container that you want to migrate, and then click Calculate Size to view the storage usage of the container.
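As an alternative to Calculate Size, the following sketch uses the azure-storage-blob package for Python to total the size and number of blobs under a prefix in the container. The connection string, container name, and prefix are placeholders.

```python
# Sketch: total the size and number of blobs under a prefix with azure-storage-blob.
# The connection string, container name, and prefix below are placeholders.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string('<your-connection-string>')
container = service.get_container_client('<your-container-name>')

total_size = 0
total_count = 0
for blob in container.list_blobs(name_starts_with='docs/'):
    total_size += blob.size         # blob size in bytes
    total_count += 1

print('Data Size: {:.2f} GB'.format(total_size / 1024 ** 3))
print('File Count: {}'.format(total_count))
```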
Tencent Cloud COS
- Region
The abbreviation of the name of the region where the bucket is deployed.
The following table describes the regions that are supported by COS API V4. For more information, see Regions.

| Region | Abbreviation |
| --- | --- |
| Beijing Zone 1 | tj |
| Beijing | bj |
| Shanghai | sh |
| Guangzhou | gz |
| Chengdu | cd |
| Singapore | sgp |
| Hong Kong (China) | hk |
| Toronto | ca |
| Frankfurt | ger |

The following tables describe the regions that are supported by COS API V5. For more information, see Regions and endpoints.

Regions inside the Chinese mainland:

| Area | Region | Abbreviation |
| --- | --- | --- |
| Regions on Public Cloud | Beijing Zone 1 | ap-beijing-1 |
| Regions on Public Cloud | Beijing | ap-beijing |
| Regions on Public Cloud | Nanjing | ap-nanjing |
| Regions on Public Cloud | Shanghai | ap-shanghai |
| Regions on Public Cloud | Guangzhou | ap-guangzhou |
| Regions on Public Cloud | Chengdu | ap-chengdu |
| Regions on Public Cloud | Chongqing | ap-chongqing |
| Regions on Financial Cloud | South China (Shenzhen Finance) | ap-shenzhen-fsi |
| Regions on Financial Cloud | East China (Shanghai Finance) | ap-shanghai-fsi |
| Regions on Financial Cloud | North China (Beijing Finance) | ap-beijing-fsi |

Hong Kong (China) and regions outside China:

| Area | Region | Abbreviation |
| --- | --- | --- |
| Asia Pacific (Public Cloud) | Hong Kong (China) | ap-hongkong |
| Asia Pacific (Public Cloud) | Singapore | ap-singapore |
| Asia Pacific (Public Cloud) | Mumbai | ap-mumbai |
| Asia Pacific (Public Cloud) | Jakarta | ap-jakarta |
| Asia Pacific (Public Cloud) | Seoul | ap-seoul |
| Asia Pacific (Public Cloud) | Bangkok | ap-bangkok |
| Asia Pacific (Public Cloud) | Tokyo | ap-tokyo |
| North America | Silicon Valley (US West) | na-siliconvalley |
| North America | Virginia | na-ashburn |
| North America | Toronto | na-toronto |
| South America | São Paulo | sa-saopaulo |
| Europe | Frankfurt | eu-frankfurt |
| Europe | Moscow | eu-moscow |
- Bucket
The name of the COS bucket. A COS bucket name is in the Custom bucket name-APPID format. APPID is the identifier of your Tencent Cloud account. Example: tony-1234567890. You can set the Bucket parameter to tony.
- Prefix
The directory in which the files that you want to migrate are located. The value of this parameter must start and end with a forward slash (/). For example, the value can be /docs/.
- APPID
The APPID of your Tencent Cloud account. To view the APPID, log on to the COS console and go to the Account Info page.
- Secret Id and Secret Key
The API key that is used to access the COS bucket. An API key consists of a SecretId and a SecretKey. To view or create an API key, log on to the COS console and go to the API Key Management page. We recommend that you create an API key for the data migration job and delete the API key after the migration job is complete.
- Data Size and File Count
The amount of data and the number of files that you want to migrate. To view these values, log on to the COS console and open the bucket that you want to migrate. On the page that appears, you can view the amount of data and the number of objects in the COS bucket.
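If you prefer to calculate these values programmatically, the following sketch assumes the cos-python-sdk-v5 package for Python and placeholder credentials, and totals the size and number of objects under a prefix. The region, API key, bucket name, and prefix are placeholders; the bucket name reuses the tony-1234567890 example from above.

```python
# Sketch: total the size and number of objects under a prefix with cos-python-sdk-v5.
# The region, credentials, bucket name, and prefix below are placeholders.
from qcloud_cos import CosConfig, CosS3Client

config = CosConfig(Region='ap-beijing', SecretId='<your-secret-id>', SecretKey='<your-secret-key>')
client = CosS3Client(config)

total_size = 0
total_count = 0
marker = ''
while True:
    resp = client.list_objects(Bucket='tony-1234567890', Prefix='docs/', Marker=marker, MaxKeys=1000)
    for obj in resp.get('Contents', []):
        total_size += int(obj['Size'])   # Size is returned as a string of bytes
        total_count += 1
    if resp['IsTruncated'] == 'false':
        break
    marker = resp['NextMarker']

print('Data Size: {:.2f} GB'.format(total_size / 1024 ** 3))
print('File Count: {}'.format(total_count))
```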
Qiniu Cloud KODO
- Endpoint
The endpoint of the bucket. Log on to the Qiniu Cloud console. In the Object Storage console, find and click the bucket from which you want to migrate data. On the domain name management tab, you can bind and view domain names.
The endpoints of buckets are in the http://<Domain name> format. The following examples show the valid types of endpoints:
http://oy4jki81y.bkt.clouddn.com
http://78rets.com1.z0.glb.clouddn.com
http://cartoon.u.qiniudn.com
Note Only the preceding three types of endpoints are supported.
- Bucket
The name of the KODO bucket.
- Prefix
The directory in which the files that you want to migrate are located. The value cannot be a file name. The value cannot start with a forward slash (/) and must end with a forward slash (/). For example, the value can be docs/.
- Access Key and Secret Key
The Access Key and the Secret Key that are used to access the KODO bucket. To view the Access Key and the Secret Key, log on to the Qiniu Cloud console. Note You must specify a valid Access Key and a valid Secret Key.
- Data Size and File Count
The amount of data and the number of files that you want to migrate. To view the amount of data and the number of files, log on to the Qiniu Cloud console, navigate to the Object Storage page, and then click the name of the bucket in which the data that you want to migrate is stored. On the page that appears, click the Content Management tab. On the Content Management tab, view the number of files and the amount of data stored in the bucket.
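If you prefer to calculate these values programmatically, the following sketch assumes the qiniu SDK for Python and its BucketManager.list interface. The credentials, bucket name, and prefix are placeholders.

```python
# Sketch: total the size and number of files under a prefix with the qiniu Python SDK.
# The credentials, bucket name, and prefix below are placeholders; BucketManager.list
# is assumed to return (ret, eof, info), where ret['items'] carries 'key' and 'fsize'.
from qiniu import Auth, BucketManager

auth = Auth('<your-access-key>', '<your-secret-key>')
manager = BucketManager(auth)

total_size = 0
total_count = 0
marker = None
while True:
    ret, eof, info = manager.list('<your-bucket-name>', prefix='docs/', marker=marker, limit=1000)
    for item in ret.get('items', []):
        total_size += item['fsize']   # file size in bytes
        total_count += 1
    if eof:
        break
    marker = ret.get('marker')

print('Data Size: {:.2f} GB'.format(total_size / 1024 ** 3))
print('File Count: {}'.format(total_count))
```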
Baidu Cloud BOS
- Endpoint
The endpoint of the BOS bucket. You can use Data Online Migration to migrate data from a BOS bucket that is deployed in the North China - Beijing, South China - Guangzhou, or East China - Suzhou region. The following table describes the valid endpoints.

| Region | Domain name | Protocol | Endpoint |
| --- | --- | --- | --- |
| North China - Beijing | bj.bcebos.com | HTTP and HTTPS | http://bj.bcebos.com |
| South China - Guangzhou | gz.bcebos.com | HTTP and HTTPS | http://gz.bcebos.com |
| East China - Suzhou | su.bcebos.com | HTTP and HTTPS | http://su.bcebos.com |

Log on to the Baidu Cloud console and choose Cloud Services > BOS. In the bucket list under Storage Management, click the name of the bucket in which the files that you want to migrate are stored. On the page that appears, you can view the region where the bucket is deployed.
- Prefix
The directory in which the files that you want to migrate are located. The value cannot be a file name. The value cannot start with a forward slash (/) and must end with a forward slash (/). For example, the value can be docs/.
- Access Key ID and Secret Access Key
The Access Key ID and the Secret Access Key that are used to access the BOS bucket. To view the Access Key ID and the Secret Access Key of your Baidu Cloud account, log on to the BOS console and view your access keys.
- Data Size and File Count
The amount of data and the number of files that you want to migrate. Log on to the Baidu Cloud console and choose Cloud Services > BOS. In the bucket list under Storage Management, click the name of the bucket in which the files that you want to migrate are stored to view its details. You must set the Data Size and File Count parameters based on the actual amount of data and the actual number of files that you want to migrate.
HTTP or HTTPS sources
- File Path
The path of the listing file that contains the URLs of the files that you want to migrate. Before you create a migration job, you must create a local listing file that contains the URLs of the files that you want to migrate. The content of this file consists of two columns:
- The left-side column contains the HTTP or HTTPS URLs of the files that you want to migrate. Special characters in the URLs must be URL-encoded. Data Online Migration uses the GET method to download files and the HEAD method to obtain the metadata of the files from the specified HTTP or HTTPS URLs.
- The right-side column contains the destination paths of the files in the Folder name/File name format. After the files are migrated, the values in this column are used as the names of the OSS objects that correspond to these files.
Separate the two columns with a tab character (\t) and separate lines with a line feed (\n). The following example shows the format:
http://127.0.0.1/docs/my.doc docs/my.doc
http://127.0.0.1/pics/my.jpg pics/my.jpg
http://127.0.0.1/exes/my.exe exes/my.exe
After the listing file is created, upload it to Alibaba Cloud OSS. Then, enter the OSS path of the listing file in the File Path field. Data Online Migration downloads the listing file and migrates files based on the URLs in the listing file. The OSS path of the listing file is in the oss://{Bucket}/{Name of the listing file} format. Example: oss://mybucket/httplist.txt
For a sketch that generates a listing file and estimates the migration size, see the example at the end of this section.
- List Access Endpoint
The OSS endpoint. The following table describes the valid formats of OSS endpoints.

| No. | Format | Description |
| --- | --- | --- |
| 1 | http://oss-cn-hangzhou.aliyuncs.com | You can use this type of endpoint to upload or download data over the Internet by using HTTP. |
| 2 | http://oss-cn-hangzhou-internal.aliyuncs.com | You can use this type of endpoint to upload or download data over the internal network by using HTTP. |
| 3 | https://oss-cn-hangzhou.aliyuncs.com | You can use this type of endpoint to upload or download data over the Internet by using HTTPS. |
| 4 | https://oss-cn-hangzhou-internal.aliyuncs.com | You can use this type of endpoint to upload or download data over the internal network by using HTTPS. |

- List Access AK and List Access SK
The AccessKey pair that is used to download the listing file. You can use an AccessKey pair of your Alibaba Cloud account or a RAM user. If you want to use the credentials of a RAM user, you must grant the RAM user the permission to call the GetObject operation.
- Data Size and File Count
The amount of data and the number of files that you want to migrate. You must set the Data Size and File Count parameters based on the actual amount of data and the actual number of files that you want to migrate.
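The following sketch shows one way to generate a listing file, estimate the Data Size and File Count values by sending HEAD requests, and upload the listing file to OSS. The URLs, bucket name, endpoint, and AccessKey pair are placeholders, and the sketch assumes the requests and oss2 packages for Python and servers that return a Content-Length header.

```python
# Sketch: build a tab-separated listing file for an HTTP/HTTPS migration job,
# estimate Data Size and File Count with HEAD requests, and upload the file to OSS.
# All URLs, names, and credentials below are placeholders.
from urllib.parse import quote
import requests
import oss2

# Source URL -> destination object name (Folder name/File name).
files = {
    'http://127.0.0.1/docs/my file.doc': 'docs/my file.doc',
    'http://127.0.0.1/pics/my.jpg': 'pics/my.jpg',
}

total_size = 0
with open('httplist.txt', 'w', encoding='utf-8') as listing:
    for url, dest in files.items():
        # URL-encode special characters (for example, spaces) but keep the scheme,
        # host, and path separators intact.
        encoded_url = quote(url, safe=':/')
        listing.write('{}\t{}\n'.format(encoded_url, dest))
        # HEAD request: the same method that Data Online Migration uses for metadata.
        resp = requests.head(encoded_url, allow_redirects=True, timeout=10)
        total_size += int(resp.headers.get('Content-Length', 0))

print('Data Size: {:.2f} GB'.format(total_size / 1024 ** 3))
print('File Count: {}'.format(len(files)))

# Upload the listing file; its File Path value is then oss://<your-bucket-name>/httplist.txt.
auth = oss2.Auth('<your-access-key-id>', '<your-access-key-secret>')
bucket = oss2.Bucket(auth, 'https://oss-cn-hangzhou.aliyuncs.com', '<your-bucket-name>')
bucket.put_object_from_file('httplist.txt', 'httplist.txt')
```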
USS
- Domain Address
The endpoint that is used to access USS. You can use one of the endpoints that are listed in the following table to call the RESTful API of USS. Note If you do not know the network type of your USS service, use http://v0.api.upyun.com.

| Network Type | Endpoint | Description |
| --- | --- | --- |
| Intelligent Routing | http://v0.api.upyun.com | Recommended |
| China Telecom | http://v1.api.upyun.com | N/A |
| China Unicom or China Netcom | http://v2.api.upyun.com | N/A |
| China Mobile (Tietong) | http://v3.api.upyun.com | N/A |

- Service Name
The name of your USS service. To view the service name, log on to the UPYUN console and navigate to the UPYUN Storage Service page.
- Migration Folder
The directory in which the files that you want to migrate are located. The value cannot start with a forward slash (/) and must end with a forward slash (/). For example, the value can be docs/.
- Operator Name and Operator Secret
The name and the password of the operator that is used to access USS. Log on to the UPYUN console, click your account name in the upper-right corner of the page, and select Account Management from the drop-down list. On the page that appears, you can view operators or add an operator. You can add an operator and specify a password for the data migration job. To allow the operator to migrate data, you must grant the operator read permissions.
On the UPYUN Storage Service page, click the name of the USS service in which the files that you want to migrate are stored and grant the operator the required permissions.
- Data Size and File Count
The amount of data and the number of files that you want to migrate. Log on to the UPYUN console and open the USS service that you want to migrate to view the amount of data and the number of files. You must set the Data Size and File Count parameters based on the actual amount of data and the actual number of files that you want to migrate.
Kingsoft Cloud KS3
- Endpoint
The following table describes the KS3 regions and endpoints.

| Region | Endpoint |
| --- | --- |
| CN North 1 (Beijing) | ks3-cn-beijing.ksyun.com |
| CN East 1 (Shanghai) | ks3-cn-shanghai.ksyun.com |
| China (Hong Kong) | ks3-cn-hk-1.ksyun.com |
| Russia (Moscow) | ks3-rus.ksyun.com |
| Singapore | ks3-sgp.ksyun.com |

Note To view the region where a bucket is deployed, log on to the KS3 console.
- Bucket
The name of the KS3 bucket.
- Prefix
The directory in which the files that you want to migrate are located. The value cannot be a file name. The value cannot start with a forward slash (/) and must end with a forward slash (/). For example, the value can be docs/.
- Access Key ID and Secret Key
The Access Key ID and the Secret Access Key that are used to access the KS3 bucket. To create them, log on to the KS3 console, create an IAM user, and attach the KS3FullAccess policy to grant the IAM user the required permissions. On the IAM user details page, create an Access Key ID and a Secret Access Key. You can delete the IAM user after the migration job is complete.
- Data Size and File Count
The amount of data and the number of files that you want to migrate. Log on to the KS3 console and view the amount of data and the number of files that are stored in the bucket. You must set the Data Size and File Count parameters based on the actual amount of data and the actual number of files that you want to migrate.
Huawei Cloud OBS
- Endpoint
The following table describes the regions and the corresponding OBS endpoints. For more information, see Regions and endpoints. All of the listed endpoints support both HTTPS and HTTP.

| Region name | Abbreviation | Endpoint |
| --- | --- | --- |
| CN North-Beijing4 | cn-north-4 | obs.cn-north-4.myhuaweicloud.com |
| CN North-Beijing1 | cn-north-1 | obs.cn-north-1.myhuaweicloud.com |
| CN East-Shanghai2 | cn-east-2 | obs.cn-east-2.myhuaweicloud.com |
| CN East-Shanghai1 | cn-east-3 | obs.cn-east-3.myhuaweicloud.com |
| CN South-Guangzhou | cn-south-1 | obs.cn-south-1.myhuaweicloud.com |
| CN Southwest-Guiyang1 | cn-southwest-2 | obs.cn-southwest-2.myhuaweicloud.com |
| AP-Bangkok | ap-southeast-2 | obs.ap-southeast-2.myhuaweicloud.com |
| AP-Hong Kong | ap-southeast-1 | obs.ap-southeast-1.myhuaweicloud.com |
| AP-Singapore | ap-southeast-3 | obs.ap-southeast-3.myhuaweicloud.com |
| AF-Johannesburg | af-south-1 | obs.af-south-1.myhuaweicloud.com |

Note To view the endpoint of the bucket in which the files that you want to migrate are stored, log on to the OBS console. On the Object Storage page, click the name of the bucket. On the Overview page, view the endpoint in the Basic Information section.
- Bucket
The name of the OBS bucket.
- Prefix
The directory in which the files that you want to migrate are located. The value cannot be a file name. The value cannot start with a forward slash (/) and must end with a forward slash (/). For example, the value can be docs/.
- Access Key ID and Secret Access Key
Log on to the Huawei Cloud console, move the pointer over your account name in the upper-right corner, and then select My Credentials from the drop-down list. In the left-side navigation pane, click Access Keys and click Create Access Key to create an access key.
- Data Size and File Count
Log on to the Huawei Cloud console, go to the OBS console, and then click the name of the bucket in which the files that you want to migrate are stored. In the left-side navigation pane, click Overview. In the Basic Statistics section, view the storage usage and the number of objects in the bucket.
UCloud UFile
- Region
The region where the UCloud UFile bucket that stores the files to be migrated is deployed. The following table lists the regions that Data Online Migration supports. For more information, see Regions and zones.

| Region | Abbreviation |
| --- | --- |
| China (Beijing 1) | cn-bj1 |
| China (Beijing 2) | cn-bj2 |
| China (Hong Kong) | hk |
| Guangzhou | cn-gd |
| China (Shanghai 2) | cn-sh2 |
| US (Los Angeles) | us-ca |
| Singapore (Singapore) | sg |
| Indonesia (Jakarta) | idn-jakarta |
| Nigeria (Lagos) | afr-nigeria |
| Brazil (São Paulo) | bra-saopaulo |
| UAE (Dubai) | uae-dubai |
| Vietnam (Ho Chi Minh City) | vn-sng |
| China (Taipei) | tw-tp |
| India (Mumbai) | ind-mumbai |
| US (Washington) | us-ws |
| Germany (Frankfurt) | ge-fra |

- Bucket
The name of the UCloud UFile bucket.
- Prefix
The directory in which the files that you want to migrate are located. The value cannot be a file name. The value cannot start with a forward slash (/) and must end with a forward slash (/). For example, the value can be docs/.
- Public Key and Private Key
The API public key and the API private key that are used to access the UCloud UFile bucket. Log on to the UCloud console and navigate to the API key management page to view the API public key and the API private key.
- Data Size and File Count
The amount of data and the number of files that you want to migrate. Log on to the UCloud console and navigate to the UFile page. On this page, click Bucket and then click the name of the bucket in which the files that you want to migrate are stored. On the Overview tab, view the storage usage of the bucket. You must set the Data Size and File Count parameters based on the actual amount of data and the actual number of files that you want to migrate.
GCP
This section describes the parameters that you need to configure when you migrate data from a Google Cloud Platform (GCP) bucket.
- Bucket
The name of the GCP bucket.
- Prefix
The directory in which the files that you want to migrate are located. The value cannot be a file name. The value cannot start with a forward slash (/) and must end with a forward slash (/). For example, the value can be docs/.
- Key File
- Log on to the GCP console.
- In the left-side navigation pane, click Service Accounts.
- On the Service Accounts page, find the service account that you want to use. In the Actions column, open the actions menu and select the option to create a key.
- In the Create private key for <Username> dialog box, select JSON as the key type, and click CREATE.
- The system automatically downloads the JSON file of the key. In the Create Data Address panel, click the Click to Upload JSON File button next to the Key File parameter to upload the downloaded JSON file.
- Data Size and File Count
The amount of data and the number of files that you want to migrate. To ensure that the migration is efficient, you must set the Data Size and File Count parameters based on the actual amount of data and the actual number of files that you want to migrate. If you set the Prefix parameter to a directory, enter the actual amount of data and the actual number of files in the specified directory.
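To verify these values with the key file that you downloaded, the following sketch uses the google-cloud-storage package for Python. The key file path, bucket name, and prefix are placeholders.

```python
# Sketch: total the size and number of objects under a prefix with google-cloud-storage.
# The key file path, bucket name, and prefix below are placeholders.
from google.cloud import storage

client = storage.Client.from_service_account_json('<path-to-key-file>.json')

total_size = 0
total_count = 0
for blob in client.list_blobs('<your-bucket-name>', prefix='docs/'):
    total_size += blob.size         # object size in bytes
    total_count += 1

print('Data Size: {:.2f} GB'.format(total_size / 1024 ** 3))
print('File Count: {}'.format(total_count))
```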