Retrieves the details of a single MaxCompute Migration Assist (MMA) migration plan.
RAM authorization
| Action | Access level | Resource type | Condition key | Dependent action |
| --- | --- | --- | --- | --- |
| odps:GetMmsJob | get | *project | None | None |
Request syntax
```
GET /api/v1/mms/datasources/{sourceId}/jobs/{jobId} HTTP/1.1
```
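As a sketch, the request path can be assembled from the two path parameters. The helper name below is hypothetical and does not include endpoint, headers, or signing:

```python
def build_mms_job_path(source_id: int, job_id: int) -> str:
    """Return the request path for retrieving a single MMA migration plan."""
    return f"/api/v1/mms/datasources/{source_id}/jobs/{job_id}"

print(build_mms_job_path(2000015, 10))
# /api/v1/mms/datasources/2000015/jobs/10
```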
Path Parameters
| Parameter | Type | Required | Description | Example |
| --- | --- | --- | --- | --- |
| sourceId | integer | Yes | The data source ID. | 2000015 |
| jobId | integer | Yes | The job ID. | 10 |
Request parameters
No parameters required.
Response elements
| Element | Type | Description | Example |
| --- | --- | --- | --- |
| | object | The response object (ApiRes). | |
| requestId | string | The request ID. | D9F872FD-5DDE-30A6-8C8A-1B8C6A81059F |
| data | object | The migration job object. | |
| id | integer | The migration job ID. | 10 |
| name | string | The name of the migration job. | migrate_db_1 |
| sourceId | integer | The data source ID. | 2 |
| dbId | integer | The source database ID. | 23 |
| sourceName | string | The name of the data source. | demo |
| srcDbName | string | The name of the source database. | mms_test |
| srcSchemaName | string | The name of the source schema. This parameter specifies the schema in a Layer 3 namespace. | default |
| dstDbName | string | The destination MaxCompute project. | mms_target |
| dstSchemaName | string | The destination MaxCompute schema. | default |
| status | string | The status of the migration job. | DOING |
| type | string | The migration scope. Valid values: Database, Tables, and Partitions. | Tables |
| taskNum | integer | The number of migration tasks included in the job. | 100 |
| stopped | boolean | Indicates whether the job is stopped. | false |
| createTime | string | The time when the job was created. | 2024-12-17 15:44:17 |
| taskDone | integer | The number of completed migration tasks. | 100 |
| config | object | The configuration of the migration job. | |
| partitions | array | If type is set to Partitions, this parameter specifies the list of partition IDs of the tables to migrate. | |
| | integer | The partition ID of a table to migrate. | [123, 132] |
| tables | array | If type is set to Tables, this parameter specifies the list of names of the tables to migrate. | |
| | string | The name of a table to migrate. | ["student", "scores"] |
| taskType | string | Deprecated. Valid values: MOCK, HIVE (a Hive user-defined table-valued function (UDTF) task), HIVE_DATAX (a Hive DataX task), COPY_TASK (an ODPS Copy Task), ODPS_INSERT_OVERWRITE (an ODPS simple insert overwrite task), MC2MC_VERIFY, OSS, HIVE_OSS, HIVE_SPARK, and BIGQUERY. | BIGQUERY |
| tableBlackList | array | If type is set to Database, this parameter specifies the tables to exclude from the migration. | |
| | string | The name of a table to exclude. | ["student", "scores"] |
| tableWhiteList | array | If type is set to Database, this parameter specifies the list of tables to migrate. If you do not specify this parameter, all tables in the database are migrated. | |
| | string | The name of a table to migrate. | ["student", "scores"] |
| partitionFilters | object | The partition filter expressions, keyed by table name. | |
| | string | The partition filter expression. Example: p1 >= '2022-03-04' and (p2 = 10 or p3 > 20) and p4 in ('abc', 'cde'). In the expression, p1, p2, p3, and p4 are partition names. Partition values can be strings or numbers; strings must be enclosed in single or double quotation marks. For partition key columns of data types other than INT and BIGINT, the partition values can only be strings. Supported comparison operators: >, >=, =, <, <=, and <>. The IN operator, the logical operators AND and OR, and parentheses are also supported. | { "student": "p1 >= '2022-03-04' and (p2 = 10 or p3 > 20) and p4 in ('abc', 'cde')" } |
| schemaOnly | boolean | Deprecated. | false |
| tableMapping | object | The mapping from source table names to destination table names. | |
| | string | The destination table name for a source table. | {'a': 'a1'} |
| increment | boolean | Specifies whether to perform incremental migration. Only new or modified partitions are migrated. Note: Modified partitions are re-migrated. | true |
| enableVerification | boolean | Specifies whether to enable data verification. The current verification method executes a SELECT COUNT statement on the source and destination tables and compares the row counts. | true |
| tunnelQuota | string | Deprecated. | Deprecated |
| columnMapping | object | The mapping from source column names to destination column names. | |
| | string | The destination column name for a source column. | {"c-1": "c_1"} |
| others | object | Other configuration information. | {"spark.executor.mem": "2g"} |
| eta | string | The expected completion time of the migration. Note: A smaller eta value indicates a higher priority for the migration task. | 2025-05-06 |
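Taken together, taskNum, taskDone, stopped, and status describe job progress. A minimal, hypothetical sketch of summarizing them from the data object (function name and wording are illustrative, not part of the API):

```python
def summarize_job(data: dict) -> str:
    """Render a one-line progress summary from a GetMmsJob `data` object."""
    task_num = data.get("taskNum", 0)
    task_done = data.get("taskDone", 0)
    pct = 100 * task_done / task_num if task_num else 0
    state = "stopped" if data.get("stopped") else data.get("status", "UNKNOWN")
    return f'{data.get("name")}: {task_done}/{task_num} tasks ({pct:.0f}%), {state}'

sample = {"name": "migrate_db_1", "taskNum": 100, "taskDone": 100,
          "stopped": False, "status": "DONE"}
print(summarize_job(sample))
# migrate_db_1: 100/100 tasks (100%), DONE
```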
Examples
Success response
JSON format
```json
{
  "requestId": "D9F872FD-5DDE-30A6-8C8A-1B8C6A81059F",
  "data": {
    "id": 10,
    "name": "migrate_db_1",
    "sourceId": 2,
    "dbId": 23,
    "sourceName": "demo",
    "srcDbName": "mms_test",
    "srcSchemaName": "default",
    "dstDbName": "mms_target",
    "dstSchemaName": "default",
    "status": "DOING",
    "type": "Tables",
    "taskNum": 100,
    "stopped": false,
    "createTime": "2024-12-17 15:44:17",
    "taskDone": 100,
    "config": {
      "partitions": [123, 132],
      "tables": ["student", "scores"],
      "taskType": "BIGQUERY",
      "tableBlackList": ["student", "scores"],
      "tableWhiteList": ["student", "scores"],
      "partitionFilters": {
        "student": "p1 >= '2022-03-04' and (p2 = 10 or p3 > 20) and p4 in ('abc', 'cde')"
      },
      "schemaOnly": false,
      "tableMapping": {
        "a": "a1"
      },
      "increment": true,
      "enableVerification": true,
      "tunnelQuota": "Deprecated",
      "columnMapping": {
        "c-1": "c_1"
      },
      "others": {
        "spark.executor.mem": "2g"
      }
    },
    "eta": "2025-05-06"
  }
}
```
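Because this operation is a point-in-time read, a caller that wants to wait for a migration to finish typically polls it until taskDone reaches taskNum or the job is stopped. A hedged sketch with an injected fetcher (the function and its defaults are illustrative, not part of the API):

```python
import time

def wait_for_job(fetch, interval_s: float = 0.0, max_polls: int = 100) -> dict:
    """Poll a GetMmsJob-style fetcher until all tasks finish or the job stops.

    `fetch` is any callable returning the `data` object from the response.
    """
    for _ in range(max_polls):
        data = fetch()
        if data.get("stopped") or data.get("taskDone", 0) >= data.get("taskNum", 0):
            return data
        time.sleep(interval_s)
    raise TimeoutError("job did not finish within max_polls")

# Usage with a fake fetcher that finishes on the third poll:
states = iter([
    {"taskNum": 100, "taskDone": 40, "stopped": False},
    {"taskNum": 100, "taskDone": 80, "stopped": False},
    {"taskNum": 100, "taskDone": 100, "stopped": False},
])
final = wait_for_job(lambda: next(states))
print(final["taskDone"])  # 100
```

In production the fetcher would issue the GET request shown in "Request syntax" and a longer interval_s would be used to avoid throttling.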
Error codes
See Error Codes for a complete list.
Release notes
See Release Notes for a complete list.