Creates a pipeline. A pipeline consists of a series of nodes that form a directed acyclic graph (DAG), which defines the machine learning process.
Authorization information
The following table shows the authorization information corresponding to the API. The authorization information can be used in the Action policy element to grant a RAM user or RAM role the permissions to call this API operation. Description:
- Operation: the value that you can use in the Action element to specify the operation on a resource.
- Access level: the access level of each operation. The levels are read, write, and list.
- Resource type: the type of the resource on which you can authorize the RAM user or the RAM role to perform the operation. Take note of the following items:
- Required resource types are marked with an asterisk (*).
- If the permissions cannot be granted at the resource level, All Resources is used in the Resource type column of the operation.
- Condition Key: the condition key that is defined by the cloud service.
- Associated operation: other operations that the RAM user or the RAM role must have permissions to perform before it can complete this operation.
| Operation | Access level | Resource type | Condition key | Associated operation |
|---|---|---|---|---|
| paiflow:CreatePipeline | create | *All Resources | none | none |
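As a sketch of how the operation name above is used, the following custom RAM policy grants a RAM user permission to call this operation. Because resource-level authorization is not supported, Resource is set to * (the policy structure follows the standard RAM policy format; attach it to the user or role as usual):

```json
{
  "Version": "1",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "paiflow:CreatePipeline",
      "Resource": "*"
    }
  ]
}
```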
Request syntax
POST /api/v1/pipelines HTTP/1.1
Request parameters
| Parameter | Type | Required | Description | Example |
|---|---|---|---|---|
| body | object | No | The request body. | |
| WorkspaceId | string | Yes | The workspace ID. You can call ListWorkspaces to obtain the workspace ID. | 72*** |
| Manifest | string | Yes | The pipeline definition. For more information, see the sample pipeline definition. | apiVersion: "core/v1"********* |
Sample pipeline definition: The pipeline consists of a read table node data_source and a data type conversion node type_transform.
```yaml
apiVersion: "core/v1"
metadata:
  provider: "166233998075****"
  version: "v1"
  identifier: "my_pipeline"
  name: "source-transform"
spec:
  inputs:
    parameters:
    - name: "execution_maxcompute"
      value:
        spec:
          endpoint: "http://service.cn.maxcompute.aliyun-inc.com/api"
          odpsProject: "test_i****"
      type: "Map"
  pipelines:
  - apiVersion: "core/v1"
    metadata:
      provider: "pai"
      version: "v1"
      identifier: "data_source"
      name: "data-source"
      displayName: "Read Table-1"
    spec:
      arguments:
        parameters:
        - name: "inputTableName"
          value: "pai_online_project.wumai_data"
        - name: "execution"
          from: "{{inputs.parameters.execution_maxcompute}}"
  - apiVersion: "core/v1"
    metadata:
      provider: "pai"
      version: "v1"
      identifier: "type_transform"
      name: "type-transform"
      displayName: "Data Type Conversion-1"
    spec:
      arguments:
        artifacts:
        - name: "inputTable"
          from: "{{pipelines.data_source.outputs.artifacts.outputTable}}"
        parameters:
        - name: "cols_to_double"
          value: "time,hour,pm2,pm10,so2,co,no2"
        - name: "execution"
          from: "{{inputs.parameters.execution_maxcompute}}"
      dependencies:
      - "data_source"
```
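In the definition, from fields such as {{pipelines.data_source.outputs.artifacts.outputTable}} are template references that the service resolves at run time, wiring one node's output to another node's input. Purely for illustration (this resolution happens server-side, not in your code), the lookup semantics can be sketched as a dotted-path walk over a nested structure:

```python
import re

def resolve_placeholders(text: str, context: dict) -> str:
    """Replace {{a.b.c}} placeholders with values looked up in a nested dict.

    Illustrative only: the PAIFlow service performs this resolution itself
    when the pipeline runs.
    """
    def lookup(match: re.Match) -> str:
        value = context
        for key in match.group(1).split("."):
            value = value[key]  # walk one level per dotted segment
        return str(value)

    return re.sub(r"\{\{([^}]+)\}\}", lookup, text)

# Hypothetical context mirroring the sample definition's input parameter.
context = {"inputs": {"parameters": {"execution_maxcompute": "odps-exec-config"}}}
print(resolve_placeholders("{{inputs.parameters.execution_maxcompute}}", context))
# prints odps-exec-config
```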
Key parameter configurations in the sample pipeline definition:
- provider: Replace the value of this parameter with your account ID.
- odpsProject: Replace the value of this parameter with the name of the MaxCompute resource that is associated with the workspace. For more information about how to obtain the name of the MaxCompute resource, see Manage workspaces.
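The Manifest parameter carries the entire YAML definition as a single string inside the JSON request body. A minimal sketch of assembling that body in Python follows; the workspace ID and manifest are placeholders taken from the parameter table, and a real call additionally requires Alibaba Cloud API signing, which the official SDKs handle for you:

```python
import json

def build_create_pipeline_body(workspace_id: str, manifest_yaml: str) -> str:
    """Assemble the JSON body for POST /api/v1/pipelines.

    The YAML pipeline definition is passed verbatim as the Manifest string.
    """
    return json.dumps({"WorkspaceId": workspace_id, "Manifest": manifest_yaml})

# Placeholder values mirroring the examples in the request parameter table.
manifest = 'apiVersion: "core/v1"\nmetadata:\n  identifier: "my_pipeline"\n'
body = build_create_pipeline_body("72***", manifest)
```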
Response parameters

| Parameter | Type | Description | Example |
|---|---|---|---|
| RequestId | string | The request ID. | DA869D1B-035A-43B2-ACC1-C56681****** |
| PipelineId | string | The pipeline ID. | pipeline-hynm2bv8********** |
Examples
Sample success responses
JSON format

```json
{
  "RequestId": "DA869D1B-035A-43B2-ACC1-C56681******",
  "PipelineId": "pipeline-hynm2bv8**********"
}
```

Error codes
For a list of error codes, see Service error codes.
Change history
| Change time | Summary of changes | Operation |
|---|---|---|
| 2024-07-24 | The internal configuration of the API was changed, but existing calls are not affected. | View Change Details |
