Cloud Parallel File Storage: DescribeDataFlowTasks

Last Updated: Dec 03, 2025

Queries the details of dataflow tasks.

Operation description

Only CPFS V2.2.0 and later and CPFS for Lingjun V2.4.0 and later support this operation. You can view the version information on the file system details page in the console.

Debugging

You can call this operation directly in OpenAPI Explorer, which saves you the trouble of calculating signatures. After the call succeeds, OpenAPI Explorer can automatically generate SDK code samples.
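
If you prefer to call the operation from code, the following minimal sketch uses the generic CommonRequest interface of the Alibaba Cloud Python SDK core (aliyunsdkcore). The endpoint (nas.cn-hangzhou.aliyuncs.com), API version (2017-06-26), and credential placeholders are assumptions for illustration only; confirm the values for your account and region, or use the SDK code that OpenAPI Explorer generates.

# Minimal sketch (assumptions noted above): call DescribeDataFlowTasks
# through the generic CommonRequest interface of aliyunsdkcore.
import json

from aliyunsdkcore.client import AcsClient
from aliyunsdkcore.request import CommonRequest

client = AcsClient("<your-access-key-id>", "<your-access-key-secret>", "cn-hangzhou")

request = CommonRequest()
request.set_accept_format("json")
request.set_domain("nas.cn-hangzhou.aliyuncs.com")   # assumed NAS endpoint
request.set_version("2017-06-26")                    # assumed NAS API version
request.set_action_name("DescribeDataFlowTasks")
request.add_query_param("FileSystemId", "cpfs-099394bd928c****")

response = json.loads(client.do_action_with_exception(request))
print(response["RequestId"])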

Authorization information

The following table shows the authorization information corresponding to the API. The authorization information can be used in the Action policy element to grant a RAM user or RAM role the permissions to call this API operation. The fields in the table are described as follows:

  • Operation: the value that you can use in the Action element to specify the operation on a resource.
  • Access level: the access level of each operation. The levels are read, write, and list.
  • Resource type: the type of the resource on which you can authorize the RAM user or the RAM role to perform the operation. Take note of the following items:
    • Required resource types are marked with an asterisk (*).
    • If the permissions cannot be granted at the resource level, All Resources is used in the Resource type column of the operation.
  • Condition Key: the condition key that is defined by the cloud service.
  • Associated operation: other operations that the RAM user or the RAM role must have permissions to perform to complete this operation.
Operation | Access level | Resource type | Condition key | Associated operation
nas:DescribeDataFlowTasks | get | *DataFlow (acs:nas:{#regionId}:{#accountId}:filesystem/{#filesystemId}) | none | none
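
As a hedged illustration of the authorization information above, the following Python snippet assembles a RAM policy that allows a RAM user or RAM role to call only this operation on a single file system. The region, account ID, and file system ID are placeholders; the resource string simply follows the ARN format listed in the table.

# Hedged sketch: a RAM policy granting nas:DescribeDataFlowTasks on one
# file system. Replace the region, account ID, and file system ID.
import json

policy = {
    "Version": "1",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "nas:DescribeDataFlowTasks",
            # ARN format from the table above: acs:nas:{regionId}:{accountId}:filesystem/{filesystemId}
            "Resource": "acs:nas:cn-hangzhou:1234567890****:filesystem/cpfs-099394bd928c****",
        }
    ],
}

print(json.dumps(policy, indent=2))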

Request parameters

Parameter | Type | Required | Description | Example
FileSystemId | string | Yes

The ID of the file system.

  • The IDs of CPFS file systems must start with cpfs-. Example: cpfs-099394bd928c****.
  • The IDs of CPFS for Lingjun file systems must start with bmcpfs-. Example: bmcpfs-290w65p03ok64ya****.
cpfs-099394bd928c****
Filters | array<object> | No

The details about filters.

object | No
Key | string | No

The filter name.

Valid values:

  • DataFlowIds: filters dataflow tasks by dataflow ID.
  • TaskIds: filters dataflow tasks by task ID.
  • Originator: filters dataflow tasks by task initiator.
  • TaskActions: filters dataflow tasks by task type.
  • DataTypes: filters dataflow tasks by data type.
  • Status: filters dataflow tasks by dataflow status.
  • CreateTimeBegin: filters dataflow tasks that are created after a specified time.
  • CreateTimeEnd: filters dataflow tasks that are created before a specified time.
  • StartTimeBegin: filters dataflow tasks that are started after a specified time.
  • StartTimeEnd: filters dataflow tasks that are started before a specified time.
  • EndTimeBegin: filters dataflow tasks that are stopped after a specified time.
  • EndTimeEnd: filters dataflow tasks that are stopped before a specified time.
DataFlowIds
Value | string | No

The value of the filter. This parameter does not support wildcards.

  • If Key is set to DataFlowIds, set Value to a dataflow ID or a part of the dataflow ID. You can specify a dataflow ID or a group of dataflow IDs. You can specify a maximum of 10 dataflow IDs. Example: df-194433a5be31**** or df-194433a512a2****,df-234533a5be31****.
  • If Key is set to TaskIds, set Value to a dataflow task ID or a part of the dataflow task ID. You can specify a dataflow task ID or a group of dataflow task IDs. You can specify a maximum of 10 dataflow task IDs. Example: task-38aa8e890f45**** or task-38aa8e890f45****,task-29ae8e890f45****.
  • If Key is set to TaskActions, set Value to the type of dataflow task. The task type can be Import, Export, Evict, Inventory, StreamImport, or StreamExport. Combined query is supported. CPFS for Lingjun supports only the Import, Export, StreamImport, and StreamExport tasks. Only CPFS for Lingjun V2.6.0 and later support the StreamImport and StreamExport tasks.
  • If Key is set to DataTypes, set Value to the data type of the dataflow task. The data type can be MetaAndData, Metadata, or Data. Combined query is supported.
  • If Key is set to Originator, set Value to the initiator of the dataflow task. The initiator can be User or System.
  • If Key is set to Status, set Value to the status of the dataflow task. The status can be Pending, Executing, Failed, Completed, Canceling, or Canceled. Combined query is supported.
  • If Key is set to CreateTimeBegin, set Value to the beginning of the time range in which the dataflow task was created. Time format: yyyy-MM-ddThh:mmZ.
  • If Key is set to CreateTimeEnd, set Value to the end of the time range in which the dataflow task was created. Time format: yyyy-MM-ddThh:mmZ.
  • If Key is set to StartTimeBegin, set Value to the beginning of the time range in which the dataflow task was started. Time format: yyyy-MM-ddThh:mmZ.
  • If Key is set to StartTimeEnd, set Value to the end of the time range in which the dataflow task was started. Time format: yyyy-MM-ddThh:mmZ.
  • If Key is set to EndTimeBegin, set Value to the beginning of the time range in which the dataflow task was stopped. Time format: yyyy-MM-ddThh:mmZ.
  • If Key is set to EndTimeEnd, set Value to the end of the time range in which the dataflow task was stopped. Time format: yyyy-MM-ddThh:mmZ.
dfid-12345678
NextToken | string | No

The pagination token that is used in the next request to retrieve a new page of results. You do not need to specify this parameter for the first request. For subsequent requests, set this parameter to the NextToken value that was returned by the previous query.

TGlzdFJlc291cmNlU****mVzJjE1MTI2NjY4NzY5MTAzOTEmMiZORnI4NDhVeEtrUT0=
MaxResults | long | No

The maximum number of results to return for each query.

Valid values: 10 to 100.

Default value: 20.

20
WithReports | boolean | No

Specifies whether to query report information. Valid values:

  • True (default)
  • False
Note
  • Set it to False to speed up the query.

  • Only CPFS for Lingjun supports this parameter.

True
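
To show how the request parameters above fit together, the following sketch queries Import tasks created after a given time and pages through all results with NextToken and MaxResults. It assumes the repeated Filters parameter is flattened into Filters.N.Key / Filters.N.Value query parameters (the usual convention for Alibaba Cloud APIs) and reuses the assumed endpoint and API version from the earlier sketch.

# Hedged sketch: filtered, paginated DescribeDataFlowTasks query.
import json

from aliyunsdkcore.client import AcsClient
from aliyunsdkcore.request import CommonRequest

client = AcsClient("<your-access-key-id>", "<your-access-key-secret>", "cn-hangzhou")

filters = [
    {"Key": "TaskActions", "Value": "Import"},
    {"Key": "CreateTimeBegin", "Value": "2024-01-01T00:00Z"},  # time format: yyyy-MM-ddThh:mmZ
]

def build_request(next_token=None):
    request = CommonRequest()
    request.set_accept_format("json")
    request.set_domain("nas.cn-hangzhou.aliyuncs.com")   # assumed endpoint
    request.set_version("2017-06-26")                    # assumed API version
    request.set_action_name("DescribeDataFlowTasks")
    request.add_query_param("FileSystemId", "cpfs-099394bd928c****")
    request.add_query_param("MaxResults", "20")
    request.add_query_param("WithReports", "False")      # skip report data to speed up the query
    for i, item in enumerate(filters, start=1):          # assumed Filters.N.Key / Filters.N.Value flattening
        request.add_query_param(f"Filters.{i}.Key", item["Key"])
        request.add_query_param(f"Filters.{i}.Value", item["Value"])
    if next_token:
        request.add_query_param("NextToken", next_token)
    return request

tasks, next_token = [], None
while True:
    page = json.loads(client.do_action_with_exception(build_request(next_token)))
    tasks.extend(page.get("TaskInfo", {}).get("Task", []))
    next_token = page.get("NextToken")
    if not next_token:   # no token means the last page has been reached
        break

print(f"{len(tasks)} dataflow tasks found")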

Response parameters

Parameter | Type | Description | Example
object
RequestId | string

The request ID.

2D69A58F-345C-4FDE-88E4-BF518948****
NextToken | string

A pagination token. It can be used in the next request to retrieve a new page of results.

TGlzdFJlc291cmNlU****mVzJjE1MTI2NjY4NzY5MTAzOTEmMiZORnI4NDhVeEtrUT0=
TaskInfo | array<object>

The information about dataflow tasks.

Task | object
FilesystemId | string

The ID of the file system.

cpfs-099394bd928c****
DataFlowId | string

The ID of the dataflow.

dfid-194433a5be3****
TaskId | string

The ID of the dataflow task.

taskId-12345678
SourceStorage | string

The access path of the source storage. Format: <storage type>://[<account id>:]<path>.

Where:

  • storage type: Only Object Storage Service (OSS) is supported.

  • account id: the UID of the account of the source storage.

  • path: the name of the OSS bucket. Limits:

    • The name can contain only lowercase letters, digits, and hyphens (-). The name must start and end with a lowercase letter or digit.
    • The name can be up to 128 characters in length.
    • The name must be encoded in UTF-8.
Note
  • The OSS bucket must be an existing bucket in the region.

  • Only CPFS for Lingjun V2.6.0 and later support the account id parameter.

oss://bucket1
FileSystemPath | string

The directory of the fileset in the CPFS file system.

Limits:

  • The directory must be 2 to 1024 characters in length.
  • The directory must be encoded in UTF-8.
  • The directory must start and end with a forward slash (/).
  • The directory must be a fileset directory in the CPFS file system.
Note Only CPFS supports this parameter.
/a/b/c/
Originator | string

The initiator of the dataflow task. The following information is displayed:

  • User: The task is initiated by a user.
  • System: The task is automatically initiated by CPFS based on the automatic update interval.
Note Only CPFS supports this parameter.
User
TaskAction | string

The type of the dataflow task. The following information is displayed:

  • Import: imports data stored in the source storage to a CPFS file system.
  • Export: exports specified data from a CPFS file system to the source storage.
  • StreamImport: imports the specified data from the source storage to a CPFS file system in streaming mode.
  • StreamExport: exports specified data from a CPFS file system to the source storage in streaming mode.
  • Evict: releases the data blocks of a file in a CPFS file system. After the eviction, only the metadata of the file is retained in the CPFS file system. You can still query the file. However, the data blocks of the file are cleared and do not occupy the storage space in the CPFS file system. When you access the file data, the file is loaded from the source storage as required.
  • Inventory: obtains the inventory list managed by a dataflow from the CPFS file system, providing the cache status of inventories in the dataflow.
Note Only CPFS for Lingjun V2.6.0 and later support StreamImport and StreamExport.
Import
DataType | string

The type of data on which operations are performed by the dataflow task. The following information is displayed:

  • Metadata: the metadata of a file, including the timestamp, ownership, and permission information of the file. If you select Metadata, only the metadata of the file is imported. You can only query the file. When you access the file data, the file is loaded from the source storage as required.
  • Data: the data blocks of the file.
  • MetaAndData: the metadata and data blocks of the file.
Note CPFS for Lingjun supports only the MetaAndData type.
Metadata
Progress | long

The progress of the dataflow task, expressed as the number of operations that the task has performed.

240
Status | string

The status of the dataflow task. The following information is displayed:

  • Pending: The dataflow task has been created and has not started.
  • Executing: The dataflow task is being executed.
  • Failed: The dataflow task failed to be executed. You can view the cause of the failure in the dataflow task report.
  • Completed: The dataflow task is completed. You can check that all the files have been correctly transferred in the dataflow task report.
  • Canceled: The dataflow task is canceled and is not completed.
  • Canceling: The dataflow task is being canceled.
Executing
ReportPath (deprecated) | string

The path in the CPFS file system where dataflow task reports are saved.

  • The task reports for a CPFS file system are generated in the .dataflow_report directory of the CPFS file system.
  • CPFS for Lingjun returns an OSS download link for you to download the task reports.
/path_in_cpfs/reportfile.cvs
CreateTime | string

The time when the task was created.

2021-08-04 18:27:35
StartTime | string

The start time of the task.

2021-08-04 18:27:35
EndTime | string

The end time of the task.

2021-08-04 18:27:35
FsPath | string

The path of the smart directory.

/aa/
ConflictPolicy | string

The conflict policy for files with the same name. Valid values:

  • SKIP_THE_FILE: skips files with the same name.
  • KEEP_LATEST: compares the update time and keeps the latest version.
  • OVERWRITE_EXISTING: forcibly overwrites the existing file.
KEEP_LATEST
Directory | string

The directory in which the dataflow task is executed.

/path_in_cpfs/
DstDirectory | string

The directory mapped to the dataflow task.

/path_in_cpfs/
ErrorMsg | string

The cause of the task exception.

Note If this parameter is not returned or the return value is empty, no error occurs.
{"ErrorKey":"PATH_NOT_ACCESSIBLE","ErrorDetail":"lstat /cpfs/370lx1ev9ss27o****/test/abcdfnotfound: no such file or directory"}
ProgressStats | object

The progress of the dataflow task.

FilesTotal | long

The number of files scanned on the source.

3
FilesDone | long

The number of files (including skipped files) for which the dataflow task is complete.

3
ActualFiles | long

The actual number of files for which the dataflow task is complete.

3
BytesTotal | long

The amount of data scanned on the source. Unit: bytes.

131092971520
BytesDone | long

The amount of data (including skipped data) for which the dataflow task is complete. Unit: bytes.

131092971520
ActualBytes | long

The actual amount of data for which the dataflow task is complete. Unit: bytes.

131092971520
RemainTime | long

The estimated remaining execution time. Unit: seconds.

437
AverageSpeed | long

The average data transfer speed. Unit: bytes/s.

342279299
Reports | array<object>

The reports.

Note
  • Streaming tasks do not support reports.

  • If the WithReports parameter is set to True, the CPFS for Lingjun report data is returned.

  • Only CPFS for Lingjun supports the WithReports parameter.

Report | object
Name | string

The name of the report.

  • CPFS:

    TotalFilesReport: task reports.

  • CPFS for Lingjun:

    • FailedFilesReport: failed file reports.
    • SkippedFilesReport: skipped file reports.
    • SuccessFilesReport: successful file reports.
TotalFilesReport
Path | string

The report URL.

https://a-hbr-temp-cn-hangzhou-staging.oss-cn-hangzhou.aliyuncs.com/temp/report/162319438359****/job-000bb6fwqficjbxk****/job-000bb6fwqficjbxk****_failed.zip?Expires=1721201422&OSSAccessKeyId=LTA****************&Signature=Fp%2BvauORTIVxooXY2tec6z0T%2Bp4%3D
Includes | string

The subdirectories to filter on. Only the contents of the specified subdirectories are transferred.

Note Only CPFS for Lingjun supports this parameter.
["/test/","/test1/"]
TransferFileListPath | string

The OSS directory that contains the file list. Data is synchronized based on the contents of the CSV file in this OSS directory.

Note Only CPFS for Lingjun supports this parameter.
/path_in_cpfs/

Examples

Sample success responses

JSON format

{
  "RequestId": "2D69A58F-345C-4FDE-88E4-BF518948****",
  "NextToken": "TGlzdFJlc291cmNlU****mVzJjE1MTI2NjY4NzY5MTAzOTEmMiZORnI4NDhVeEtrUT0=",
  "TaskInfo": {
    "Task": [
      {
        "FilesystemId": "cpfs-099394bd928c****",
        "DataFlowId": "dfid-194433a5be3****",
        "TaskId": "taskId-12345678",
        "SourceStorage": "oss://bucket1",
        "FileSystemPath": "/a/b/c/",
        "Originator": "User",
        "TaskAction": "Import",
        "DataType": "Metadata",
        "Progress": 240,
        "Status": "Executing",
        "ReportPath": "/path_in_cpfs/reportfile.cvs",
        "CreateTime": "2021-08-04 18:27:35",
        "StartTime": "2021-08-04 18:27:35",
        "EndTime": "2021-08-04 18:27:35",
        "FsPath": "/aa/",
        "ConflictPolicy": "KEEP_LATEST",
        "Directory": "/path_in_cpfs/",
        "DstDirectory": "/path_in_cpfs/\n",
        "ErrorMsg": {
          "ErrorKey": "PATH_NOT_ACCESSIBLE",
          "ErrorDetail": "lstat /cpfs/370lx1ev9ss27o****/test/abcdfnotfound: no such file or directory"
        },
        "ProgressStats": {
          "FilesTotal": 3,
          "FilesDone": 3,
          "ActualFiles": 3,
          "BytesTotal": 131092971520,
          "BytesDone": 131092971520,
          "ActualBytes": 131092971520,
          "RemainTime": 437,
          "AverageSpeed": 342279299
        },
        "Reports": {
          "Report": [
            {
              "Name": "TotalFilesReport",
              "Path": "https://a-hbr-temp-cn-hangzhou-staging.oss-cn-hangzhou.aliyuncs.com/temp/report/162319438359****/job-000bb6fwqficjbxk****/job-000bb6fwqficjbxk****_failed.zip?Expires=1721201422&OSSAccessKeyId=LTA****************&Signature=Fp%2BvauORTIVxooXY2tec6z0T%2Bp4%3D"
            }
          ]
        },
        "Includes": [
          "/test/",
          "/test1/"
        ],
        "TransferFileListPath": "/path_in_cpfs/\n"
      }
    ]
  }
}
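
The sketch below shows one way a client might read the fields in the sample response above: it walks TaskInfo.Task, computes a rough completion percentage from ProgressStats, and prints any report download paths. The helper is only an illustration of the response structure; the response argument stands for the parsed JSON body returned by the call.

# Hedged sketch: summarize the tasks in a parsed DescribeDataFlowTasks response.
def summarize_dataflow_tasks(response: dict) -> None:
    for task in response.get("TaskInfo", {}).get("Task", []):
        status = task.get("Status")
        stats = task.get("ProgressStats") or {}
        bytes_total = stats.get("BytesTotal") or 0
        bytes_done = stats.get("BytesDone") or 0
        percent = 100.0 * bytes_done / bytes_total if bytes_total else 0.0
        line = (f"{task.get('TaskId')} [{task.get('TaskAction')}] {status}: "
                f"{percent:.1f}% of {bytes_total} bytes")
        if status == "Executing" and stats.get("RemainTime") is not None:
            line += f", about {stats['RemainTime']} s remaining"
        if status == "Failed" and task.get("ErrorMsg"):
            line += f", error: {task['ErrorMsg']}"
        print(line)
        # CPFS for Lingjun returns downloadable report URLs in Reports.Report[].Path;
        # CPFS returns a path inside the file system (see ReportPath above).
        for report in (task.get("Reports") or {}).get("Report", []):
            print(f"  report {report.get('Name')}: {report.get('Path')}")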

Error codes

HTTP status code | Error code | Error message | Description
400 | IllegalCharacters | The parameter contains illegal characters. | The parameter contains illegal characters.
400 | MissingFileSystemId | FileSystemId is mandatory for this action. | -
400 | InvalidFilesystemVersion.NotSupport | This Api does not support this fileSystem version. | This Api does not support this fileSystem version.
403 | OperationDenied.DataFlowNotSupported | The operation is not supported. | -
404 | InvalidParameter.InvalidNextToken | The specified NextToken is invaild. | -
404 | InvalidFileSystem.NotFound | The specified file system does not exist. | The specified file system does not exist.
404 | InvalidDataFlow.NotFound | The specified data flow does not exist. | -
404 | InvalidParameter.InvalidMaxResults | The specified MaxResults is invalid. | -
404 | InvalidFilterParam | The specified Filter.N.Key is invalid. | -

For a list of error codes, see Service error codes.
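
When the operation is called through the Python SDK core, the server-side errors listed above are raised as ServerException; the sketch below shows one way to branch on the documented error codes. The endpoint and API version remain the same assumptions as in the earlier sketches.

# Hedged sketch: handle the documented error codes via aliyunsdkcore exceptions.
from aliyunsdkcore.acs_exception.exceptions import ClientException, ServerException
from aliyunsdkcore.client import AcsClient
from aliyunsdkcore.request import CommonRequest

client = AcsClient("<your-access-key-id>", "<your-access-key-secret>", "cn-hangzhou")
request = CommonRequest()
request.set_accept_format("json")
request.set_domain("nas.cn-hangzhou.aliyuncs.com")   # assumed endpoint
request.set_version("2017-06-26")                    # assumed API version
request.set_action_name("DescribeDataFlowTasks")
request.add_query_param("FileSystemId", "cpfs-099394bd928c****")

try:
    body = client.do_action_with_exception(request)
except ServerException as err:
    code = err.get_error_code()
    if code == "InvalidFileSystem.NotFound":
        print("Check the FileSystemId parameter:", err.get_error_msg())
    elif code == "InvalidParameter.InvalidNextToken":
        print("The pagination token is no longer valid; restart from the first page.")
    else:
        raise
except ClientException:
    # Local problems such as network failures or misconfigured credentials.
    raise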

Change history

Change time | Summary of changes | Operation
2024-09-09 | The Error code has changed. The response structure of the API has changed. | View Change Details
2024-02-29 | The Error code has changed. The response structure of the API has changed. | View Change Details