Queries the data snapshot of an extension point based on the ID of the message that is received when the related extension point event is triggered.

Debugging

OpenAPI Explorer automatically calculates the signature value. We recommend that you call this operation in OpenAPI Explorer, which also dynamically generates sample code for the operation in different SDKs.

Request parameters

Parameter Type Required Example Description
Action String Yes GetIDEEventDetail

The operation that you want to perform.

MessageId String Yes 8abcb91f-d266-4073-b907-2ed670378ed1

The ID of the message. You can obtain the ID from the received message when the extension point event is triggered.

ProjectId Long Yes 10000

The ID of the workspace. You can obtain the ID from the message.

RegionId String Yes cn-zhangjiakou

The ID of the region in which the DataWorks workspace resides. For example, the ID of the China (Shanghai) region is cn-shanghai, and that of the China (Zhangjiakou) region is cn-zhangjiakou. The system automatically determines the value of this parameter based on the endpoint used to call the operation.
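The request parameters above can be assembled into a query string as a quick sanity check. The following sketch is illustrative only: it omits the common request parameters (Signature, AccessKeyId, Timestamp, and so on), which the SDK or OpenAPI Explorer computes for you, and the endpoint shown is an assumption based on the region ID in the example.

```python
from urllib.parse import urlencode

def build_get_ide_event_detail_query(message_id: str, project_id: int,
                                     endpoint: str = "dataworks.cn-zhangjiakou.aliyuncs.com") -> str:
    """Assemble the query string for a GetIDEEventDetail call.

    Common request parameters (Signature, AccessKeyId, Timestamp, and so on)
    are omitted; in practice the SDK or OpenAPI Explorer computes them.
    RegionId is determined by the service from the endpoint.
    """
    params = {
        "Action": "GetIDEEventDetail",
        "MessageId": message_id,
        "ProjectId": project_id,
    }
    return f"https://{endpoint}/?{urlencode(params)}"
```

This mirrors the sample request shown in the Examples section, with the common request parameters left to the signing layer.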

Response parameters

Parameter Type Example Description
RequestId String 8abcb91f-d266-4073-b907-2ed670378ed1

The ID of the request. You can troubleshoot errors based on the ID.

EventDetail Object

The data snapshot that is generated when the extension point event is triggered.

The fields contained in data snapshots vary based on the types of the triggered extension point events. For more information, see the description of the fields.

FileExecutionCommand Object

The data snapshot when the code in the file is run. The value of this parameter is not empty only when the message type is IDE_FILE_EXECUTE_BEFORE.

FileId Long 1234123

The ID of the file.

DataSourceName String odps_first

The name of the compute engine instance with which the file is associated.

Content String SHOW TABLES;

The code in the file of the current version.

FileType Long 10

The type of the code in the file. Examples: 6 (Shell), 10 (ODPS SQL), 11 (ODPS MR), 23 (Data Integration), 24 (ODPS Script), 99 (zero load), 221 (PyODPS 2), 225 (ODPS Spark), 227 (EMR Hive), 228 (EMR Spark), 229 (EMR Spark SQL), 230 (EMR MR), 239 (OSS object inspection), 257 (EMR Shell), 258 (EMR Spark Shell), 259 (EMR Presto), 260 (EMR Impala), 900 (real-time sync), 1089 (cross-tenant collaboration), 1091 (Hologres development), 1093 (Hologres SQL), 1100 (assignment), and 1221 (PyODPS 3).
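The FileType codes listed above can be turned into readable names with a simple lookup table. This is a convenience sketch built from the enumeration in this document; it is not an exhaustive list of all file types that DataWorks supports.

```python
# FileType codes enumerated in this document, mapped to their node types.
FILE_TYPES = {
    6: "Shell", 10: "ODPS SQL", 11: "ODPS MR", 23: "Data Integration",
    24: "ODPS Script", 99: "zero load", 221: "PyODPS 2", 225: "ODPS Spark",
    227: "EMR Hive", 228: "EMR Spark", 229: "EMR Spark SQL", 230: "EMR MR",
    239: "OSS object inspection", 257: "EMR Shell", 258: "EMR Spark Shell",
    259: "EMR Presto", 260: "EMR Impala", 900: "real-time sync",
    1089: "cross-tenant collaboration", 1091: "Hologres development",
    1093: "Hologres SQL", 1100: "assignment", 1221: "PyODPS 3",
}

def file_type_name(code: int) -> str:
    """Return the readable name for a FileType code, or a fallback label."""
    return FILE_TYPES.get(code, f"unknown ({code})")
```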

DeletedFile Object

The data snapshot when the file is deleted. The value of this parameter is not empty only when the message type is IDE_FILE_DELETE_BEFORE.

Owner String 7384234****

The owner of the file.

FileType Long 10

The type of the code in the file. Examples: 6 (Shell), 10 (ODPS SQL), 11 (ODPS MR), 23 (Data Integration), 24 (ODPS Script), 99 (zero load), 221 (PyODPS 2), 225 (ODPS Spark), 227 (EMR Hive), 228 (EMR Spark), 229 (EMR Spark SQL), 230 (EMR MR), 239 (OSS object inspection), 257 (EMR Shell), 258 (EMR Spark Shell), 259 (EMR Presto), 260 (EMR Impala), 900 (real-time sync), 1089 (cross-tenant collaboration), 1091 (Hologres development), 1093 (Hologres SQL), 1100 (assignment), and 1221 (PyODPS 3).

CurrentVersion Long 1

The latest version number of the file.

BusinessId Long 74328

The ID of the workflow to which the file belongs.

FileName String hello_dataworks.sql

The name of the file.

DataSourceName String odps_first

The name of the compute engine instance with which the file is associated.

UseType String NORMAL

The module to which the file belongs. Valid values:

  • NORMAL: The file is used for DataStudio.
  • MANUAL: The file is used for a manually triggered node.
  • MANUAL_BIZ: The file is used for a manually triggered workflow.
  • SKIP: The file is used for a dry-run DataStudio node.
  • ADHOCQUERY: The file is used for an ad hoc query.
  • COMPONENT: The file is used for a snippet.
FolderId String aldurie78l2falure

The ID of the folder to which the file belongs. You can call the GetFolder operation to query the details of the folder based on the folder ID.

ParentFileId Long 1234122

The ID of the do-while node or for-each node that corresponds to the file.

Content String SHOW TABLES;

The code in the file of the current version.

NodeId Long 421429

The ID of the node that is scheduled.

FileId Long 1234123

The ID of the file.

CommittedFile Object

The data snapshot when the file is committed and deployed.

The value of this parameter is not empty only when the message type is IDE_FILE_SUBMIT_BEFORE or IDE_FILE_DEPLOY_BEFORE.

FileId Long 1234123

The ID of the file.

Content String SHOW TABLES;

The code in the file of the current version.

Committor String 7384234****

The ID of the Alibaba Cloud account that was used to commit the file of the current version.

FileType Long 10

The type of the code in the file. Examples: 6 (Shell), 10 (ODPS SQL), 11 (ODPS MR), 23 (Data Integration), 24 (ODPS Script), 99 (zero load), 221 (PyODPS 2), 225 (ODPS Spark), 227 (EMR Hive), 228 (EMR Spark), 229 (EMR Spark SQL), 230 (EMR MR), 239 (OSS object inspection), 257 (EMR Shell), 258 (EMR Spark Shell), 259 (EMR Presto), 260 (EMR Impala), 900 (real-time sync), 1089 (cross-tenant collaboration), 1091 (Hologres development), 1093 (Hologres SQL), 1100 (assignment), and 1221 (PyODPS 3).

ChangeType String UPDATE

The type of the change to the file of the current version. Valid values: CREATE, UPDATE, and DELETE.

FileName String hello_dataworks.sql

The name of the file.

NodeId Long 421429

The ID of the node that is scheduled.

Comment String Second version

The description of the file version.

UseType String NORMAL

The module to which the file belongs. Valid values:

  • NORMAL: The file is used for DataStudio.
  • MANUAL: The file is used for a manually triggered node.
  • MANUAL_BIZ: The file is used for a manually triggered workflow.
  • SKIP: The file is used for a dry-run DataStudio node.
  • ADHOCQUERY: The file is used for an ad hoc query.
  • COMPONENT: The file is used for a snippet.
FilePropertyContent Object

The details of the file.

DataSourceName String odps_first

The name of the compute engine instance with which the file is associated.

ParentFileId Long 1234122

The ID of the do-while node or for-each node that corresponds to the file.

BusinessId Long 74328

The ID of the workflow to which the file belongs.

CurrentVersion Long 1

The latest version number of the file.

Owner String 7384234****

The owner of the file.

FolderId String aldurie78l2falure

The ID of the folder to which the file belongs. You can call the GetFolder operation to query the details of the folder based on the folder ID.

NodeConfiguration Object

The scheduling properties of the node that corresponds to the file.

RerunMode String ALL_ALLOWED

Indicates whether the node can be rerun. Valid values:

  • ALL_ALLOWED: The node can be rerun regardless of whether it is successfully run or fails to run.
  • FAILURE_ALLOWED: The node can be rerun only after it fails to run.
  • ALL_DENIED: The node cannot be rerun regardless of whether it is successfully run or fails to run.

This parameter is equivalent to the Rerun parameter in the Schedule section of the Properties panel in the DataWorks console.
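A client reacting to these snapshots can interpret the RerunMode values above with a small helper. This is an illustrative sketch of the semantics described in this section, not part of the API itself.

```python
def may_rerun(rerun_mode: str, last_run_failed: bool) -> bool:
    """Decide whether a node may be rerun, per the RerunMode values above."""
    if rerun_mode == "ALL_ALLOWED":
        return True          # rerunnable whether the last run succeeded or failed
    if rerun_mode == "FAILURE_ALLOWED":
        return last_run_failed  # rerunnable only after a failed run
    if rerun_mode == "ALL_DENIED":
        return False
    raise ValueError(f"unknown RerunMode: {rerun_mode}")
```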

SchedulerType String NORMAL

The scheduling type of the node. Valid values:

  • NORMAL: The node is an auto triggered node.
  • MANUAL: The node is a manually triggered node. Manually triggered nodes cannot be automatically triggered. They correspond to the nodes in the Manually Triggered Workflows pane.
  • PAUSE: The node is a paused node.
  • SKIP: The node is a dry-run node. Dry-run nodes are started as scheduled but the system sets the status of the nodes to successful when it starts to run them.
ParaValue String a=x b=y

The scheduling parameters.

This parameter is equivalent to the configuration of the scheduling parameters in the Parameters section of the Properties panel in the DataWorks console. For more information, see Configure scheduling parameters.

CycleType String DAY

The type of the scheduling cycle of the node that corresponds to the file. Valid values: NOT_DAY and DAY. A value of NOT_DAY indicates that the node is scheduled to run by minute or hour. A value of DAY indicates that the node is scheduled to run by day, week, or month.

This parameter is equivalent to the Scheduling Cycle parameter in the Schedule section of the Properties panel in the DataWorks console.

DependentNodeIdList String 5,10,15,20

The IDs of the nodes on which the node corresponding to the file depends when the DependentType parameter is set to USER_DEFINE. Multiple IDs are separated by commas (,).

This parameter is equivalent to the field that appears after Previous Cycle is selected and the Depend On parameter is set to Other Nodes in the Dependencies section of the Properties panel in the DataWorks console.
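Because DependentNodeIdList is a single comma-separated string rather than an array, callers typically split it before use. A minimal sketch:

```python
def parse_dependent_node_ids(dependent_node_id_list: str) -> list:
    """Split the comma-separated DependentNodeIdList value into integer node IDs."""
    if not dependent_node_id_list:
        return []
    return [int(part) for part in dependent_node_id_list.split(",")]
```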

ResourceGroupId Long 375827434852437

The ID of the resource group that is used to run the node that corresponds to the file. You can call the ListResourceGroups operation to query the available resource groups in the workspace.

AutoRerunTimes Long 3

The number of times that the node corresponding to the file can be rerun.

AutoRerunIntervalMillis Long 120000

The interval at which the node corresponding to the file is rerun. Unit: milliseconds.

CronExpress String 00 05 00 * * ?

The CRON expression that is used to schedule the node corresponding to the file.

InputList Array of Input

The output names of the parent files on which the current file depends.

Input String dw_project_root

The output name of a parent file on which the current file depends.

This parameter is equivalent to the Output Name parameter under Parent Nodes in the Dependencies section of the Properties panel in the DataWorks console.

ParseType String MANUAL

The mode in which the scheduling dependencies of the file are configured. Valid values:

  • MANUAL: The scheduling dependencies are manually configured.
  • AUTO: The scheduling dependencies are automatically parsed.
OutputList Array of Output

The output names of the current file.

This parameter is equivalent to the Output Name parameter under Output in the Dependencies section of the Properties panel in the DataWorks console.

RefTableName String ods_user_info_d

The output table name of the current file.

This parameter is equivalent to the Output Table Name parameter under Output in the Dependencies section of the Properties panel in the DataWorks console.

Output String dw_project.002_out

The output name of the current file.

This parameter is equivalent to the Output Name parameter under Output in the Dependencies section of the Properties panel in the DataWorks console.

DependentType String USER_DEFINE

The type of the cross-cycle scheduling dependency of the node that corresponds to the file. Valid values:

  • SELF: The instance generated for the node in the current cycle depends on the instance generated for the node in the previous cycle.
  • CHILD: The instance generated for the node in the current cycle depends on the instances generated for the descendant nodes at the nearest level of the node in the previous cycle.
  • USER_DEFINE: The instance generated for the node in the current cycle depends on the instances generated for one or more specified nodes in the previous cycle.
  • NONE: No cross-cycle scheduling dependency type is selected for the node.
TableModel Object

The data snapshot when the table is committed and deployed. The value of this parameter is not empty only when the message type is IDE_TABLE_SUBMIT_BEFORE or IDE_TABLE_DEPLOY_BEFORE.

Env String DEV

The environment in which the table is used. Valid values:

  • DEV: development environment
  • PROD: production environment
LifeCycle Long 7

The lifecycle of the table. Unit: days.

TableName String tb_hello

The name of the table.

DataSourceName String odps_first

The name of the compute engine instance to which the table belongs.

Columns Array of Column

The columns in the table.

ColumnName String ID

The name of the column.

ColumnType String BIGINT

The data type of the column.

IsPartitionColumn Boolean false

Indicates whether the column is a partition key column. Valid values:

  • true: The column is a partition key column.
  • false: The column is not a partition key column.
Comment String ID

The remarks of the column.

Comment String A new table

The description of the table.

Location String hdfs://path/to/object

The path of the table.
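For logging or review workflows, a TableModel snapshot can be rendered as a CREATE TABLE statement. The sketch below assumes Columns has been normalized to a list of column objects (in the JSON sample a single column is serialized as an object) and produces MaxCompute-style DDL for inspection only; it is not guaranteed to match the statement DataWorks itself generates.

```python
def table_model_to_ddl(table_model: dict) -> str:
    """Render an illustrative CREATE TABLE statement from a TableModel snapshot.

    Assumes table_model["Columns"] is a list of column dicts with the
    ColumnName, ColumnType, IsPartitionColumn, and Comment fields above.
    """
    cols, parts = [], []
    for col in table_model.get("Columns", []):
        rendered = f"{col['ColumnName']} {col['ColumnType']} COMMENT '{col.get('Comment', '')}'"
        (parts if col.get("IsPartitionColumn") else cols).append(rendered)
    ddl = f"CREATE TABLE {table_model['TableName']} (\n  " + ",\n  ".join(cols) + "\n)"
    if table_model.get("Comment"):
        ddl += f"\nCOMMENT '{table_model['Comment']}'"
    if parts:
        ddl += "\nPARTITIONED BY (" + ", ".join(parts) + ")"
    if table_model.get("LifeCycle"):
        ddl += f"\nLIFECYCLE {table_model['LifeCycle']}"
    return ddl
```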

Examples

Sample requests

http(s)://[Endpoint]/?Action=GetIDEEventDetail
&MessageId=8abcb91f-d266-4073-b907-2ed670378ed1
&ProjectId=10000
&<Common request parameters>

Sample success responses

XML format

HTTP/1.1 200 OK
Content-Type:application/xml

<GetIDEEventDetailResponse>
    <RequestId>8abcb91f-d266-4073-b907-2ed670378ed1</RequestId>
    <EventDetail>
        <FileExecutionCommand>
            <FileId>1234123</FileId>
            <DataSourceName>odps_first</DataSourceName>
            <Content>SHOW TABLES;</Content>
            <FileType>10</FileType>
        </FileExecutionCommand>
        <DeletedFile>
            <Owner>7384234****</Owner>
            <FileType>10</FileType>
            <CurrentVersion>1</CurrentVersion>
            <BusinessId>74328</BusinessId>
            <FileName>hello_dataworks.sql</FileName>
            <DataSourceName>odps_first</DataSourceName>
            <UseType>NORMAL</UseType>
            <FolderId>aldurie78l2falure</FolderId>
            <ParentFileId>1234122</ParentFileId>
            <Content>SHOW TABLES;</Content>
            <NodeId>421429</NodeId>
            <FileId>1234123</FileId>
        </DeletedFile>
        <CommittedFile>
            <FileId>1234123</FileId>
            <Content>SHOW TABLES;</Content>
            <Committor>7384234****</Committor>
            <FileType>10</FileType>
            <ChangeType>UPDATE</ChangeType>
            <FileName>hello_dataworks.sql</FileName>
            <NodeId>421429</NodeId>
            <Comment>Second version</Comment>
            <UseType>NORMAL</UseType>
            <FilePropertyContent>
                <DataSourceName>odps_first</DataSourceName>
                <ParentFileId>1234122</ParentFileId>
                <BusinessId>74328</BusinessId>
                <CurrentVersion>1</CurrentVersion>
                <Owner>7384234****</Owner>
                <FolderId>aldurie78l2falure</FolderId>
            </FilePropertyContent>
            <NodeConfiguration>
                <RerunMode>ALL_ALLOWED</RerunMode>
                <SchedulerType>NORMAL</SchedulerType>
                <ParaValue>a=x b=y</ParaValue>
                <CycleType>DAY</CycleType>
                <DependentNodeIdList>5,10,15,20</DependentNodeIdList>
                <ResourceGroupId>375827434852437</ResourceGroupId>
                <AutoRerunTimes>3</AutoRerunTimes>
                <AutoRerunIntervalMillis>120000</AutoRerunIntervalMillis>
                <CronExpress>00 05 00 * * ?</CronExpress>
                <InputList>
                    <Input>dw_project_root</Input>
                    <ParseType>MANUAL</ParseType>
                </InputList>
                <OutputList>
                    <RefTableName>ods_user_info_d</RefTableName>
                    <Output>dw_project.002_out</Output>
                </OutputList>
                <DependentType>USER_DEFINE</DependentType>
            </NodeConfiguration>
        </CommittedFile>
        <TableModel>
            <Env>DEV</Env>
            <LifeCycle>7</LifeCycle>
            <TableName>tb_hello</TableName>
            <DataSourceName>odps_first</DataSourceName>
            <Columns>
                <ColumnName>ID</ColumnName>
                <ColumnType>BIGINT</ColumnType>
                <IsPartitionColumn>false</IsPartitionColumn>
                <Comment>ID</Comment>
            </Columns>
            <Comment>A new table</Comment>
            <Location>hdfs://path/to/object</Location>
        </TableModel>
    </EventDetail>
</GetIDEEventDetailResponse>

JSON format

HTTP/1.1 200 OK
Content-Type:application/json

{
  "RequestId" : "8abcb91f-d266-4073-b907-2ed670378ed1",
  "EventDetail" : {
    "FileExecutionCommand" : {
      "FileId" : 1234123,
      "DataSourceName" : "odps_first",
      "Content" : "SHOW TABLES;",
      "FileType" : 10
    },
    "DeletedFile" : {
      "Owner" : "7384234****",
      "FileType" : 10,
      "CurrentVersion" : 1,
      "BusinessId" : 74328,
      "FileName" : "hello_dataworks.sql",
      "DataSourceName" : "odps_first",
      "UseType" : "NORMAL",
      "FolderId" : "aldurie78l2falure",
      "ParentFileId" : 1234122,
      "Content" : "SHOW TABLES;",
      "NodeId" : 421429,
      "FileId" : 1234123
    },
    "CommittedFile" : {
      "FileId" : 1234123,
      "Content" : "SHOW TABLES;",
      "Committor" : "7384234****",
      "FileType" : 10,
      "ChangeType" : "UPDATE",
      "FileName" : "hello_dataworks.sql",
      "NodeId" : 421429,
      "Comment": "Second version",
      "UseType" : "NORMAL",
      "FilePropertyContent" : {
        "DataSourceName" : "odps_first",
        "ParentFileId" : 1234122,
        "BusinessId" : 74328,
        "CurrentVersion" : 1,
        "Owner" : "7384234****",
        "FolderId" : "aldurie78l2falure"
      },
      "NodeConfiguration" : {
        "RerunMode" : "ALL_ALLOWED",
        "SchedulerType" : "NORMAL",
        "ParaValue" : "a=x b=y",
        "CycleType" : "DAY",
        "DependentNodeIdList" : "5,10,15,20",
        "ResourceGroupId" : 375827434852437,
        "AutoRerunTimes" : 3,
        "AutoRerunIntervalMillis" : 120000,
        "CronExpress" : "00 05 00 * * ?",
        "InputList" : {
          "Input" : "dw_project_root",
          "ParseType" : "MANUAL"
        },
        "OutputList" : {
          "RefTableName" : "ods_user_info_d",
          "Output" : "dw_project.002_out"
        },
        "DependentType" : "USER_DEFINE"
      }
    },
    "TableModel" : {
      "Env" : "DEV",
      "LifeCycle" : 7,
      "TableName" : "tb_hello",
      "DataSourceName" : "odps_first",
      "Columns" : {
        "ColumnName" : "ID",
        "ColumnType" : "BIGINT",
        "IsPartitionColumn" : false,
        "Comment" : "ID"
      },
      "Comment" : "A new table",
      "Location" : "hdfs://path/to/object"
    }
  }
}
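In a real response, only one of the four snapshot fields is populated, depending on the message type that triggered the event (the sample above shows all four purely for illustration). A caller can dispatch on whichever field is present:

```python
import json

# Snapshot fields documented in Response parameters; a real response
# populates only the one matching the triggering message type.
SNAPSHOT_FIELDS = ("FileExecutionCommand", "DeletedFile", "CommittedFile", "TableModel")

def extract_snapshot(response_json: str):
    """Return (field_name, snapshot) for the populated EventDetail field,
    or (None, None) if no snapshot is present."""
    detail = json.loads(response_json).get("EventDetail", {})
    for field in SNAPSHOT_FIELDS:
        if detail.get(field):
            return field, detail[field]
    return None, None
```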

Error codes

Http status code Error code Error message Description
400 InternalError.UserId.Missing An internal system error occurred. Try again later. The error message returned because an internal error has occurred. Try again later.
403 Forbidden.Access Access is forbidden. Please first activate DataWorks Enterprise Edition or Flagship Edition. The error message returned because you are not allowed to perform this operation. Activate DataWorks Enterprise Edition or Flagship Edition.
429 Throttling.Api The request for this resource has exceeded your available limit. The error message returned because the number of requests for the resource has exceeded the upper limit.
429 Throttling.System The DataWorks system is busy. Try again later. The error message returned because the DataWorks system is busy. Try again later.
429 Throttling.User Your request is too frequent. Try again later. The error message returned because excessive requests have been submitted within a short period of time. Try again later.
500 InternalError.System An internal system error occurred. Try again later. The error message returned because an internal error has occurred. Try again later.

For a list of error codes, visit the API Error Center.
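The throttling (429) and internal (500) errors above are transient, so clients commonly retry them with exponential backoff. The following is an illustrative client-side pattern, not part of the API; `send_request` stands in for whatever transport you use.

```python
import random
import time

# HTTP status codes from the table above that are worth retrying:
# Throttling.Api / Throttling.System / Throttling.User and InternalError.System.
RETRYABLE = {429, 500}

def call_with_backoff(send_request, max_attempts: int = 4, base_delay: float = 1.0):
    """Retry a request on throttling/internal errors with exponential backoff.

    `send_request` is any zero-argument callable returning (status_code, body).
    """
    for attempt in range(max_attempts):
        status, body = send_request()
        if status not in RETRYABLE:
            return status, body
        if attempt < max_attempts - 1:
            # Exponential backoff with a little jitter to avoid thundering herds.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))
    return status, body
```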