DataWorks:ListDataQualityScans

Last Updated: Jan 12, 2026

Queries the list of data quality scan tasks in a project.

Operation description

This API operation is available for all DataWorks editions.

Debugging

You can call this operation directly in OpenAPI Explorer, which saves you the trouble of calculating signatures. After the call succeeds, OpenAPI Explorer can automatically generate SDK code samples.

Authorization information

The following table shows the authorization information for this API operation. You can use this information in the Action element of a RAM policy to grant a RAM user or RAM role the permissions to call this operation. The fields in the table are described as follows:

  • Operation: the value that you can use in the Action element to specify the operation on a resource.
  • Access level: the access level of each operation. The levels are read, write, and list.
  • Resource type: the type of the resource on which you can authorize the RAM user or the RAM role to perform the operation. Take note of the following items:
    • Required resource types are marked with an asterisk (*).
    • If the permissions cannot be granted at the resource level, All Resources is used in the Resource type column of the operation.
  • Condition Key: the condition key that is defined by the cloud service.
  • Associated operation: other operations that the RAM user or RAM role must be authorized to perform in order to complete this operation.

Operation: dataworks:ListDataQualityScans
Access level: list
Resource type: *All Resources (*)
Condition key: none
Associated operation: none
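
For reference, the sketch below builds a minimal RAM policy statement that grants this operation. The policy structure (Version, Statement, Effect, Action, Resource) is the standard RAM policy format; setting Resource to "*" corresponds to the All Resources entry above. Adjust the statement to your own authorization requirements.

import json

# A minimal RAM policy statement for this operation (illustrative sketch).
# "Resource": "*" matches the "All Resources" resource type listed above.
policy = {
    "Version": "1",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "dataworks:ListDataQualityScans",
            "Resource": "*",
        }
    ],
}

print(json.dumps(policy, indent=2))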

Request parameters

Name (string, optional)
The name of the data quality scan task. Fuzzy match is supported.
Example: test

SortBy (string, optional)
The field and order to sort the list by. Sorting by the last modified time, the creation time, and the ID is supported. Format: "<sort field> <sort order>", where the sort order is Desc or Asc (default: Asc). Valid values:
  • ModifyTime (Desc/Asc)
  • CreateTime (Desc/Asc)
  • Id (Desc/Asc)
Example: ModifyTime Desc

PageSize (integer, required)
The number of entries per page. Default value: 10.
Example: 10

PageNumber (integer, required)
The page number. Default value: 1.
Example: 1

ProjectId (long, required)
The project ID.
Example: 10000

Table (string, optional)
The name of the monitored table. Fuzzy match is supported.
Example: video_album
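
The operation is paginated through PageNumber and PageSize, and the response reports TotalCount, so a caller typically loops until every page has been fetched. The sketch below shows one way to do this; call_list_data_quality_scans is a placeholder for however you actually invoke the API (for example, code generated by OpenAPI Explorer), and list_all_scans and its keyword arguments are illustrative assumptions rather than part of an official SDK.

import math

def list_all_scans(call_list_data_quality_scans, project_id, name=None, page_size=10):
    """Collect every data quality scan task by walking the pages.

    `call_list_data_quality_scans` is a hypothetical callable that performs the
    signed request and returns the parsed JSON response documented below.
    """
    scans = []
    page_number = 1
    while True:
        response = call_list_data_quality_scans(
            ProjectId=project_id,      # required
            PageSize=page_size,        # required; default 10
            PageNumber=page_number,    # required; default 1
            Name=name,                 # optional fuzzy match on the task name
            SortBy="CreateTime Desc",  # optional sort field and order
        )
        page = response["PageInfo"]
        scans.extend(page.get("DataQualityScans", []))
        total_pages = math.ceil(page["TotalCount"] / page_size)
        if page_number >= total_pages:
            break
        page_number += 1
    return scans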

Response parameters

The response body is an object that contains the following fields.

RequestId (string)
The API request ID, which is generated as a UUID.
Example: 0bc14115***159376359

PageInfo (object)
The pagination information.

PageNumber (integer)
The page number.
Example: 1

PageSize (integer)
The number of entries per page. Default value: 10.
Example: 10

TotalCount (integer)
The total number of entries returned.
Example: 1

DataQualityScans (array<object>)
The list of data quality monitors.

DataQualityScan (object)
Information about a single data quality monitor.

ModifyTime (long)
The time when the data quality monitor was last updated.
Example: 17236236472

Owner (string)
The user ID of the owner of the data quality monitor.
Example: 23782382795249

Description (string)
The description of the data quality scan task. Maximum length: 65,535 characters.
Example: This is an hourly run data quality evaluation plan.

ModifyUser (string)
The user ID of the user who last updated the data quality monitor.
Example: 23782382795249

Parameters (array<object>)
The execution parameter definitions of the data quality monitor.

Parameter (object)
An execution parameter definition of the data quality monitor.

Value (string)
The parameter value.
Example: $[yyyy-mm-dd-1]

Name (string)
The parameter name.
Example: dt

CreateTime (long)
The time when the data quality monitor was created.
Example: 1694512304000

ProjectId (long)
The project ID.
Example: 59094

CreateUser (string)
The user ID of the creator of the data quality monitor.
Example: 7892346529452

ComputeResource (object)
The compute engine used during execution. If it is not specified, the data source connection defined in the Spec is used.

Runtime (object)
Additional runtime settings for the data quality monitor.

HiveConf (string)
Additional parameters for the Hive engine. Currently, only mapreduce.job.queuename is supported, which specifies the queue.
Example: mapreduce.job.queuename=dq_queue

SparkConf (string)
Additional parameters for the Spark engine. Currently, only spark.yarn.queue is supported, which specifies the queue.
Example: spark.yarn.queue=dq_queue

Engine (string)
The engine type. These settings are supported only for the EMR compute engine. Valid values:
  • Hive: Hive SQL
  • Spark: Spark SQL
  • Kyuubi
Example: Hive

EnvType (string)
The workspace environment of the compute engine. Valid values:
  • Prod
  • Dev
Example: Prod

Name (string)
The name of the compute engine, which uniquely identifies the engine.
Example: emr_cluster_001

Name (string)
The name of the data quality scan task. The name can contain digits, letters, Chinese characters, and both half-width and full-width punctuation marks. Maximum length: 255 characters.
Example: Hourly partition quality monitoring

RuntimeResource (object)
The resource group used to run the data quality monitor.

Cu (float)
The compute unit (CU) consumption of the task run.
Example: 0.25

Id (string)
The ID of the resource group.
Example: Serverless_resource_group_xxxxx

Image (string)
The ID of the image configured for running the task.
Example: i-xxxxx

Trigger (object)
The trigger settings of the data quality monitor.

Type (string)
The trigger mode of the data quality monitor. Valid values:
  • ByManual: manually triggered. This is the default value.
  • BySchedule: triggered by a scheduled task instance.
Example: BySchedule

TaskIds (array)
The IDs of the scheduling tasks that trigger the monitor. If the trigger mode is BySchedule, this parameter must be configured.

TaskId (long)
The scheduling task ID.
Example: 1023777390

Hooks (array<object>)
The hook configurations that are applied after the data quality monitor stops.

Hook (object)
The configuration of a hook that is applied after the data quality monitor stops.

Condition (string)
The hook trigger condition. When this condition is met, the hook is triggered. The expression specifies one or more combinations of rule severity levels and rule validation statuses, for example: results.any { r -> r.status == 'Fail' && r.rule.severity == 'Normal' || r.status == 'Error' && r.rule.severity == 'High' || r.status == 'Warn' && r.rule.severity == 'High' }. This means the hook is triggered if any executed rule produces Fail with Normal severity, Error with High severity, or Warn with High severity. The severity values must match those defined in the Spec, and the status values must match those in DataQualityResult. For readability, a Python rendering of this example condition is shown after the response fields.
Example: results.any { r -> r.status == 'Fail' && r.rule.severity == 'Normal' || r.status == 'Error' && r.rule.severity == 'High' || r.status == 'Warn' && r.rule.severity == 'High' }

Type (string)
The type of the hook. Valid values:
  • BlockTaskInstance: blocks the scheduling of the task instance.
Example: BlockTaskInstance

Id (long)
The ID of the data quality monitor.
Example: 26433
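
The example hook Condition above is written in an expression syntax, not Python. The short sketch below only restates that example predicate in Python to make the grouping of the && and || clauses explicit; the results list and its status and rule.severity fields mirror the names in the expression and are assumptions for illustration.

def hook_condition(results):
    # Fires if ANY rule result matches one of the status/severity combinations
    # from the example Condition: Fail+Normal, Error+High, or Warn+High.
    return any(
        (r["status"] == "Fail" and r["rule"]["severity"] == "Normal")
        or (r["status"] == "Error" and r["rule"]["severity"] == "High")
        or (r["status"] == "Warn" and r["rule"]["severity"] == "High")
        for r in results
    )

# A single high-severity error is enough to trigger the hook.
assert hook_condition([{"status": "Error", "rule": {"severity": "High"}}])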

Examples

Sample success responses

JSON format

{
  "RequestId": "0bc14115***159376359",
  "PageInfo": {
    "PageNumber": 1,
    "PageSize": 10,
    "TotalCount": 1,
    "DataQualityScans": [
      {
        "ModifyTime": 17236236472,
        "Owner": 23782382795249,
        "Description": "This is a hourly run data quality evaluation plan.",
        "ModifyUser": 23782382795249,
        "Parameters": [
          {
            "Value": "$[yyyy-mm-dd-1]",
            "Name": "dt"
          }
        ],
        "CreateTime": 1694512304000,
        "ProjectId": 59094,
        "CreateUser": 7892346529452,
        "ComputeResource": {
          "Runtime": {
            "HiveConf": "mapreduce.job.queuename=dq_queue",
            "SparkConf": "spark.yarn.queue=dq_queue",
            "Engine": "Hive"
          },
          "EnvType": "Prod",
          "Name": "emr_cluster_001"
        },
        "Name": "Hourly partition quality monitoring",
        "RuntimeResource": {
          "Cu": 0.25,
          "Id": "Serverless_resource_group_xxxxx",
          "Image": "i-xxxxx"
        },
        "Trigger": {
          "Type": "BySchedule",
          "TaskIds": [
            1023777390
          ]
        },
        "Hooks": [
          {
            "Condition": "results.any { r -> r.status == 'Fail' && r.rule.severity == 'Normal' || r.status == 'Error' && r.rule.severity == 'High' || r.status == 'Warn' && r.rule.severity == 'High' }",
            "Type": "BlockTaskInstance"
          }
        ],
        "Id": 26433
      }
    ]
  }
}
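
To show how the response is typically consumed, the sketch below parses a response of the shape shown above (assumed to be saved locally as response.json, a file name chosen for this example) and prints the pagination summary plus a few fields of each scan task.

import json
import math

# response.json is assumed to contain a response like the sample above.
with open("response.json") as f:
    data = json.load(f)

page = data["PageInfo"]
total_pages = math.ceil(page["TotalCount"] / page["PageSize"])
print(f"{page['TotalCount']} scan task(s) across {total_pages} page(s)")

for scan in page["DataQualityScans"]:
    blocks_scheduling = any(
        hook["Type"] == "BlockTaskInstance" for hook in scan.get("Hooks", [])
    )
    print(scan["Id"], scan["Name"], scan["Trigger"]["Type"], blocks_scheduling)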

Error codes

For a list of error codes, see Service error codes.