Creates a dataflow task.
Operation description
CPFS usage notes
- Only CPFS V2.2.0 and later support dataflows. You can view the version information on the file system details page in the console.
- Dataflow tasks are executed asynchronously. You can call the DescribeDataFlowTasks operation to query the task execution status. The task duration depends on the amount of data to be imported and exported. If a large amount of data exists, we recommend that you create multiple tasks.
- You can create a dataflow task only for a dataflow that is in the Running state.
- When you manually run a dataflow task, the automatic data update task for the dataflow is interrupted and enters the pending state.
- When you create an export task, make sure that the total length of the absolute path of the files to be exported from a CPFS file system does not exceed 1,023 characters.
CPFS for Lingjun usage notes
- Only CPFS for Lingjun V2.4.0 and later support dataflows. You can view the version information on the file system details page in the console.
- Dataflow tasks are executed asynchronously. You can call the DescribeDataFlowTasks operation to query the task execution status. The task duration depends on the amount of data to be imported and exported. If a large amount of data exists, we recommend that you create multiple tasks.
- You can create a dataflow task only for a dataflow that is in the Running state.
- When you create an export task, make sure that the total length of the absolute path of the files to be exported from a CPFS for Lingjun file system does not exceed 1,023 characters.
- CPFS for Lingjun supports two types of tasks: batch tasks and streaming tasks. For more information, see Task types.
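The usage notes above impose a 1,023-character limit on the absolute paths of files exported from a file system. The following sketch shows a pre-flight check for that limit; the function name is illustrative and not part of any SDK, and it assumes the limit applies to each file path individually.

```python
# Hypothetical pre-flight check for the 1,023-character export-path limit
# mentioned in the usage notes. Not part of any SDK.
MAX_EXPORT_PATH_LEN = 1023

def validate_export_paths(paths):
    """Return the paths whose absolute length exceeds the CPFS limit."""
    return [p for p in paths if len(p) > MAX_EXPORT_PATH_LEN]

ok_path = "/path_in_cpfs/file1"
bad_path = "/" + "a" * 1023          # 1,024 characters in total
print(validate_export_paths([ok_path, bad_path]))  # prints [bad_path] only
```

Running such a check before task creation avoids submitting an export task that the service would reject.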
Debugging
Authorization information
The following table shows the authorization information corresponding to the API. The authorization information can be used in the Action policy element to grant a RAM user or RAM role the permissions to call this API operation. Description:
- Operation: the value that you can use in the Action element to specify the operation on a resource.
- Access level: the access level of each operation. The levels are read, write, and list.
- Resource type: the type of the resource on which you can authorize the RAM user or the RAM role to perform the operation. Take note of the following items:
  - Required resource types are marked with an asterisk (*).
  - If the permissions cannot be granted at the resource level, All Resources is used in the Resource type column of the operation.
- Condition key: the condition key that is defined by the cloud service.
- Associated operation: other operations that the RAM user or the RAM role must have permissions to perform to complete the operation.
| Operation | Access level | Resource type | Condition key | Associated operation |
|---|---|---|---|---|
| nas:CreateDataFlowTask | create | *DataFlow<br>acs:nas:{#regionId}:{#accountId}:filesystem/{#filesystemId} | none | none |
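Based on the authorization table, a minimal RAM policy granting this operation could look like the following. The region, account ID, and file system ID are placeholders taken from the document's examples.

```json
{
  "Version": "1",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "nas:CreateDataFlowTask",
      "Resource": "acs:nas:cn-hangzhou:123456789012****:filesystem/bmcpfs-290w65p03ok64ya****"
    }
  ]
}
```

Narrowing `Resource` to a specific file system ARN restricts the grant to dataflow tasks on that file system only.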
Request parameters
| Parameter | Type | Required | Description | Example |
|---|---|---|---|---|
| FileSystemId | string | Yes | The ID of the file system. | bmcpfs-290w65p03ok64ya**** |
| DataFlowId | string | Yes | The ID of the dataflow. | df-194433a5be31**** |
| SrcTaskId | string | No | The ID of an existing dataflow task. If you specify SrcTaskId, the system copies the TaskAction, DataType, and EntryList parameters from the source dataflow task, and you do not need to specify them.<br>**Note** Streaming dataflow tasks are not supported. | task-27aa8e890f45**** |
| TaskAction | string | No | The type of the dataflow task.<br>**Note** CPFS for Lingjun supports only Import, Export, StreamImport, and StreamExport. Only CPFS for Lingjun V2.6.0 and later support StreamImport and StreamExport. | Import |
| DataType | string | No | The type of data on which the dataflow task performs operations. | Metadata |
| Directory | string | No | The source directory of the data.<br>**Note** Only CPFS for Lingjun V2.6.0 and later support StreamImport and StreamExport. | /path_in_cpfs/ |
| EntryList | string | No | The list of files on which the dataflow task performs operations. | ["/path_in_cpfs/file1", "/path_in_cpfs/file2"] |
| DryRun | boolean | No | Specifies whether to perform a dry run. During the dry run, the system checks whether the request parameters are valid and whether the requested resources are available. No dataflow task is created and no fee is incurred. Valid values: true and false. | false |
| ClientToken | string | No | The client token that is used to ensure the idempotence of the request. You can use the client to generate the token, but you must make sure that the token is unique among different requests. The token can contain only ASCII characters and cannot exceed 64 characters in length. For more information, see How to ensure idempotence.<br>**Note** If you do not specify this parameter, the system automatically uses the request ID as the client token. The request ID may be different for each request. | 123e4567-e89b-12d3-a456-42665544**** |
| ConflictPolicy | string | No | The conflict policy for files with the same name.<br>**Note** This parameter is required for CPFS for Lingjun file systems. | SKIP_THE_FILE |
| DstDirectory | string | No | The destination directory mapped to the dataflow task.<br>**Note** Only CPFS for Lingjun V2.6.0 and later support StreamImport and StreamExport. | /path_in_cpfs/ |
| CreateDirIfNotExist | boolean | No | Specifies whether to automatically create the directory if it does not exist. Valid values: true and false. | false |
| Includes | string | No | Filters subdirectories and transfers only their contents. | ["/test/","/test1/"] |
| TransferFileListPath | string | No | The OSS directory that contains the CSV file based on which data is synchronized. | /test_oss_path/ |
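The parameters above can be assembled client-side before signing and sending the request. The sketch below is an illustration, not SDK code: the parameter names come from the table, the HTTP signing and transport layer is omitted, and the helper name and 64 KB EntryList limit check (from the error code table) are assumptions for this example.

```python
import json
import uuid

# Illustrative assembly of CreateDataFlowTask request parameters.
# The signing/transport layer depends on the SDK you use and is not shown.
MAX_ENTRY_LIST_BYTES = 64 * 1024  # the API rejects an EntryList over 64 KB

def build_create_dataflow_task_params(file_system_id, data_flow_id,
                                      task_action=None, entry_list=None,
                                      dry_run=False, client_token=None):
    params = {
        "FileSystemId": file_system_id,
        "DataFlowId": data_flow_id,
        "DryRun": dry_run,
        # A unique ClientToken makes retries of the same request idempotent.
        "ClientToken": client_token or str(uuid.uuid4()),
    }
    if task_action is not None:
        params["TaskAction"] = task_action
    if entry_list is not None:
        encoded = json.dumps(entry_list)
        if len(encoded.encode("utf-8")) > MAX_ENTRY_LIST_BYTES:
            raise ValueError("EntryList exceeds 64 KB")
        params["EntryList"] = encoded
    return params

params = build_create_dataflow_task_params(
    "bmcpfs-290w65p03ok64ya****", "df-194433a5be31****",
    task_action="Import",
    entry_list=["/path_in_cpfs/file1", "/path_in_cpfs/file2"])
```

Generating the ClientToken locally, as shown, guards against duplicate task creation when a request is retried after a timeout.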
Response parameters
| Parameter | Type | Description | Example |
|---|---|---|---|
| RequestId | string | The request ID. | 2D69A58F-345C-4FDE-88E4-BF518948**** |
| TaskId | string | The ID of the dataflow task. | task-38aa8e890f45**** |
Examples
Sample success responses
JSON format
{
"RequestId": "2D69A58F-345C-4FDE-88E4-BF518948****",
"TaskId": "task-38aa8e890f45****"
}
Error codes
| HTTP status code | Error code | Error message | Description |
|---|---|---|---|
| 400 | IllegalCharacters | The parameter contains illegal characters. | The parameter contains illegal characters. |
| 400 | MissingFileSystemId | FileSystemId is mandatory for this action. | - |
| 400 | MissingDataFlowId | DataFlowId is mandatory for this action. | - |
| 400 | InvalidFilesystemVersion.NotSupport | This Api does not support this fileSystem version. | This Api does not support this fileSystem version. |
| 403 | OperationDenied.InvalidState | The operation is not permitted when the status is processing. | The operation is not permitted when the status is processing. |
| 403 | OperationDenied.DependencyViolation | The operation is denied due to dependancy violation. | - |
| 403 | OperationDenied.DataFlowNotSupported | The operation is not supported. | - |
| 404 | InvalidParameter.InvalidFormat | The EntryList format is invalid. | - |
| 404 | InvalidParameter.SizeTooLarge | The specified EntryList size exceeds 64 KB. | - |
| 404 | InvalidDataFlow.NotFound | The specified data flow does not exist. | - |
| 404 | InvalidTaskAction.NotSupported | The task action is not supported. | - |
| 404 | InvalidTaskAction.PermissionDenied | The task action is not allowed. | - |
| 404 | InvalidSrcTaskId.NotFound | The SrcTaskId is not found. | - |
| 404 | InvalidDataType.NotSupported | The data type is not supported. | - |
| 404 | InvalidSrcTaskId.TaskIdInvalid | Source task ID is invalid. | - |
| 404 | InvalidSrcTaskId.TaskIdNotFound | Source task ID is not found. | - |
For a complete list of error codes, see Service error codes.
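Some of the errors listed above reflect a transient state (for example, OperationDenied.InvalidState is returned while the dataflow is still processing), while others indicate a malformed request that retrying cannot fix. The grouping below is an assumption made for illustration, not documented SDK behavior.

```python
# Hypothetical split of the error codes above into retryable (transient)
# and non-retryable (caller error) groups; an assumption for illustration.
RETRYABLE_ERROR_CODES = {
    "OperationDenied.InvalidState",  # dataflow still processing; retry later
}

def is_retryable(error_code):
    """Return True if the request may succeed on a later retry."""
    return error_code in RETRYABLE_ERROR_CODES

print(is_retryable("OperationDenied.InvalidState"))  # True
print(is_retryable("IllegalCharacters"))             # False
```

A caller would typically back off and retry only the retryable group, surfacing the rest to the user immediately.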
Change history
| Change time | Summary of changes | Operation |
|---|---|---|
| 2024-09-09 | The error codes and the request parameters of the API have changed. | View Change Details |
| 2024-02-29 | The error codes and the request parameters of the API have changed. | View Change Details |
