
MaxCompute:ListMmsJobs

Last Updated: Feb 13, 2026

Lists migration jobs.

Try it now

You can try this API in OpenAPI Explorer without manually signing requests. A successful call automatically generates SDK sample code that matches your parameters, which you can download for local use with built-in credential security.


RAM authorization

The following table describes the authorization required to call this API. You can define the authorization in a Resource Access Management (RAM) policy. The columns of the table are described below:

  • Action: The action that you can specify in the Action element of a RAM policy statement to grant permission to perform the operation.

  • API: The API that you can call to perform the action.

  • Access level: The predefined level of access granted for each API. Valid values: create, list, get, update, and delete.

  • Resource type: The type of resource that supports authorization for the action. It indicates whether the action supports resource-level permissions. The specified resource must be compatible with the action; otherwise, the policy is ineffective.

    • For APIs with resource-level permissions, required resource types are marked with an asterisk (*). Specify the corresponding Alibaba Cloud Resource Name (ARN) in the Resource element of the policy.

    • For APIs without resource-level permissions, the resource type is shown as All Resources. Use an asterisk (*) in the Resource element of the policy.

  • Condition key: The condition keys defined by the service. A condition key allows for granular control and applies either to actions alone or to actions associated with specific resources. In addition to service-specific condition keys, Alibaba Cloud provides a set of common condition keys that apply across all RAM-supported services.

  • Dependent action: The actions that the action depends on. To complete the action, the RAM user or RAM role must have the permissions to perform all dependent actions.

| Action | Access level | Resource type | Condition key | Dependent action |
| --- | --- | --- | --- | --- |
| odps:ListMmsJobs | list | *project (acs:odps:{#regionId}:{#accountId}:mmsdatasource/{#sourceId}) | None | None |
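Based on the table above, a RAM policy that grants permission to call this operation might look like the following sketch. The region (cn-hangzhou), account ID, and data source ID below are placeholder assumptions; substitute your own values in the ARN.

```json
{
  "Version": "1",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "odps:ListMmsJobs",
      "Resource": "acs:odps:cn-hangzhou:123456789012:mmsdatasource/2000002"
    }
  ]
}
```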

Request syntax

GET /api/v1/mms/datasources/{sourceId}/jobs HTTP/1.1
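As a minimal sketch of the request shape, the following Python snippet assembles the request URL from the path parameter and optional query parameters. The endpoint host used here is an assumption, and request signing is omitted; in practice the SDK or OpenAPI Explorer signs the request for you.

```python
from urllib.parse import urlencode


def build_list_mms_jobs_url(endpoint: str, source_id: int, **query) -> str:
    """Build the ListMmsJobs request URL; query holds optional
    parameters such as status, pageNum, and pageSize."""
    path = f"/api/v1/mms/datasources/{source_id}/jobs"
    # Drop parameters the caller left unset.
    qs = urlencode({k: v for k, v in query.items() if v is not None})
    return f"https://{endpoint}{path}" + (f"?{qs}" if qs else "")


url = build_list_mms_jobs_url(
    "maxcompute.cn-hangzhou.aliyuncs.com",  # placeholder endpoint, an assumption
    2000002,
    status="DOING", pageNum=1, pageSize=10)
print(url)
```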

Path parameters

| Parameter | Type | Required | Description | Example |
| --- | --- | --- | --- | --- |
| sourceId | integer | Yes | The data source ID. | 2000002 |

Request parameters

| Parameter | Type | Required | Description | Example |
| --- | --- | --- | --- | --- |
| name | string | No | The name of the data source. | demo |
| srcDbName | string | No | The name of the source database. | test_db_1 |
| srcTableName | string | No | The name of the source table. | test_table_1 |
| dstDbName | string | No | The destination MaxCompute project. | mms_test |
| dstTableName | string | No | The destination MaxCompute table. | test_table_1 |
| status | string | No | The status of the migration job. Valid values: INIT (the migration has not started), DOING (the migration is in progress), DONE (the migration is successful), and FAILED (the migration failed). | DOING |
| stopped | integer | No | Specifies whether the job is stopped. | false |
| timerId | integer | No | The timer ID. | 1 |
| pageNum | integer | No | The number of the page to return. | 1 |
| pageSize | integer | No | The number of entries to return per page. | 10 |
| sorter.status | string | No | Sorts results by status. Valid values: asc (ascending) and desc (descending). | desc |
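Because results are paginated by pageNum and pageSize, callers typically loop over pages until the reported total is collected. A minimal sketch, assuming a hypothetical fetch_page callable that wraps the signed ListMmsJobs request and returns the API's data object:

```python
def list_all_jobs(fetch_page, page_size=10):
    """Walk pageNum until every migration job reported by `total` has
    been collected. fetch_page(page_num, page_size) is a hypothetical
    wrapper around the signed HTTP call."""
    jobs, page = [], 1
    while True:
        data = fetch_page(page, page_size)
        jobs.extend(data["objectList"])
        # Stop once all records are fetched, or the server returns an
        # empty page (defensive guard against an inconsistent total).
        if len(jobs) >= data["total"] or not data["objectList"]:
            break
        page += 1
    return jobs


def fake_fetch(page_num, page_size):
    """Stand-in for the real API call, used only to demonstrate the loop."""
    total = 25
    start = (page_num - 1) * page_size
    rows = [{"id": i, "status": "DONE"}
            for i in range(start, min(start + page_size, total))]
    return {"total": total, "objectList": rows,
            "pageNum": page_num, "pageSize": page_size}


jobs = list_all_jobs(fake_fetch)
print(len(jobs))  # → 25
```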

Response elements

| Element | Type | Description | Example |
| --- | --- | --- | --- |
| (root) | object | The returned data. | |
| requestId | string | The request ID. | 1112E7C7-C65F-57A2-A7C7-3B178AA257B6 |
| data | object | The details of the returned data. | |
| data.total | integer | The total number of records. | 100 |
| data.objectList | array<object> | The list of migration jobs. Each entry is a migration job object. | |
| data.objectList[].id | integer | The migration job ID. | 18 |
| data.objectList[].name | string | The name of the migration job. | migrate_db_1 |
| data.objectList[].sourceId | integer | The data source ID. | 2000015 |
| data.objectList[].dbId | integer | The source database ID. | 196 |
| data.objectList[].sourceName | string | The name of the data source. | demo |
| data.objectList[].srcDbName | string | The name of the source database. | test_db_1 |
| data.objectList[].srcSchemaName | string | The source schema name. This is the schema in a three-layer namespace. | test_table_1 |
| data.objectList[].dstDbName | string | The destination MaxCompute project. | mms_test |
| data.objectList[].dstSchemaName | string | The destination MaxCompute schema. | test_table_1 |
| data.objectList[].status | string | The status of the migration job. Valid values: INIT (the job has not started), DOING (the job is running), DONE (the job is complete), and FAILED (the job failed). | DONE |
| data.objectList[].type | string | The migration scope. Valid values: Database (migrates a single database), Tables (migrates multiple tables), and Partitions (migrates multiple partitions). | Tables |
| data.objectList[].taskNum | integer | The number of migration tasks in the job. | 10 |
| data.objectList[].stopped | boolean | Indicates whether the migration job is stopped. | false |
| data.objectList[].createTime | string | The time when the migration job was created. | 2024-12-17 15:44:17 |
| data.objectList[].taskDone | integer | The number of completed migration tasks. | 10 |
| data.objectList[].config | object | The configuration of the migration job. | |
| data.objectList[].config.partitions | array<integer> | When type is set to Partitions, the partition IDs of the tables to migrate. | [123, 132] |
| data.objectList[].config.tables | array<string> | When type is set to Tables, the names of the tables to migrate. | ["student", "scores"] |
| data.objectList[].config.taskType | string | Deprecated. Valid values: MOCK, HIVE (Hive UDTF task), HIVE_DATAX (Hive DataX task), COPY_TASK (MaxCompute copy task), ODPS_INSERT_OVERWRITE (simple MaxCompute INSERT OVERWRITE task), MC2MC_VERIFY, OSS, HIVE_OSS, HIVE_SPARK, and BIGQUERY. | BIGQUERY |
| data.objectList[].config.tableBlackList | array<string> | When type is set to Database, the tables to exclude from migration. | ["student", "scores"] |
| data.objectList[].config.tableWhiteList | array<string> | When type is set to Database, the tables to migrate. If you do not specify tableWhiteList, all tables in the specified database are migrated. | ["student", "scores"] |
| data.objectList[].config.partitionFilters | object | The partition filter expression for each table, keyed by table name. In an expression such as p1 >= '2022-03-04' and (p2 = 10 or p3 > 20) and p4 in ('abc', 'cde'), p1 through p4 are partition column names. Partition values are strings or numbers; strings are enclosed in single or double quotation marks, and only string values are allowed for partition columns other than the INT and BIGINT types. Supported comparison operators: >, >=, =, <, <=, and <>. The IN operator, the logical operators AND and OR, and parentheses for grouping conditions are also supported. | { "student": "p1 >= '2022-03-04' and (p2 = 10 or p3 > 20) and p4 in ('abc', 'cde')" } |
| data.objectList[].config.schemaOnly | boolean | Deprecated. | false |
| data.objectList[].config.tableMapping | object | The mapping from source table names to destination table names. | {"a": "a1"} |
| data.objectList[].config.increment | boolean | Indicates whether incremental migration is enabled. Only new or modified partitions are migrated; modified partitions are re-migrated. | true |
| data.objectList[].config.enableVerification | boolean | Indicates whether verification is enabled. The current method runs SELECT COUNT on both the source and the destination and compares the row counts. | true |
| data.objectList[].config.tunnelQuota | string | Deprecated. | Deprecated |
| data.objectList[].config.columnMapping | object | The mapping from source column names to destination column names. | {"c-1": "c_1"} |
| data.objectList[].config.others | object | Other configuration settings. | {"spark.executor.mem": "2g"} |
| data.objectList[].eta | string | The estimated completion time of the migration job. A smaller eta value gives the job higher priority. | 2025-05-06 |
| data.pageNum | integer | The page number. | 1 |
| data.pageSize | integer | The number of entries returned per page. | 10 |

Examples

Success response

JSON format

{
  "requestId": "1112E7C7-C65F-57A2-A7C7-3B178AA257B6",
  "data": {
    "total": 100,
    "objectList": [
      {
        "id": 18,
        "name": "migrate_db_1",
        "sourceId": 2000015,
        "dbId": 196,
        "sourceName": "demo",
        "srcDbName": "test_db_1",
        "srcSchemaName": "test_table_1",
        "dstDbName": "mms_test",
        "dstSchemaName": "test_table_1",
        "status": "DONE",
        "type": "Tables",
        "taskNum": 10,
        "stopped": false,
        "createTime": "2024-12-17 15:44:17\n",
        "taskDone": 10,
        "config": {
          "partitions": [
            0
          ],
          "tables": [
            "[\"student\", \"scores\"]"
          ],
          "taskType": "BIGQUERY",
          "tableBlackList": [
            "[\"student\", \"scores\"]"
          ],
          "tableWhiteList": [
            "[\"student\", \"scores\"]"
          ],
          "partitionFilters": {
            "key": "{\n\"student\": \"p1 >= '2022-03-04' and (p2 = 10 or p3 > 20) and p4 in ('abc', 'cde')\" \n}"
          },
          "schemaOnly": false,
          "tableMapping": {
            "key": "{'a': 'a1'}"
          },
          "increment": true,
          "enableVerification": true,
          "tunnelQuota": "Depcreated",
          "columnMapping": {
            "key": "{\"c-1\": \"c_1\"}"
          },
          "others": {
            "spark.executor.mem": "2g"
          }
        },
        "eta": "2025-05-06"
      }
    ],
    "pageNum": 1,
    "pageSize": 10
  }
}
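As a minimal sketch of consuming the response shape shown above, the following Python snippet tallies jobs by status and aggregates task progress across a page of results. The summarize helper and the trimmed sample response are illustrative, not part of the API.

```python
def summarize(response: dict) -> dict:
    """Tally jobs per status and aggregate taskDone/taskNum across
    the jobs in one ListMmsJobs response page."""
    by_status = {}
    done = total_tasks = 0
    for job in response["data"]["objectList"]:
        by_status[job["status"]] = by_status.get(job["status"], 0) + 1
        done += job.get("taskDone", 0)
        total_tasks += job.get("taskNum", 0)
    return {"byStatus": by_status, "taskDone": done, "taskNum": total_tasks}


# A trimmed response mirroring the sample above.
resp = {
    "requestId": "1112E7C7-C65F-57A2-A7C7-3B178AA257B6",
    "data": {
        "total": 1,
        "objectList": [
            {"id": 18, "name": "migrate_db_1", "status": "DONE",
             "taskNum": 10, "taskDone": 10}
        ],
        "pageNum": 1,
        "pageSize": 10,
    },
}
print(summarize(resp))  # → {'byStatus': {'DONE': 1}, 'taskDone': 10, 'taskNum': 10}
```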

Error codes

See Error Codes for a complete list.

Release notes

See Release Notes for a complete list.