DataWorks:GetIDEEventDetail

Last Updated: Apr 03, 2024

Queries the data snapshot of an extension point at the time the related extension point event is triggered, based on the ID of the message received from DataWorks OpenEvent.

Debugging

OpenAPI Explorer automatically calculates the signature value and dynamically generates sample code for this operation in different SDKs. For your convenience, we recommend that you call this operation in OpenAPI Explorer.

Request parameters

Action (String, required)
Example: GetIDEEventDetail
The operation that you want to perform.

MessageId (String, required)
Example: 8abcb91f-d266-4073-b907-2ed670378ed1
The message ID in DataWorks OpenEvent. You can obtain the ID from a received message when an extension point event is triggered.

ProjectId (Long, required)
Example: 10000
The DataWorks workspace ID. You can obtain the ID from the message.
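
The following sketch shows one way to call this operation with the Alibaba Cloud Python SDK core package (aliyun-python-sdk-core). The endpoint, region, and API version used here are assumptions and should be verified against the DataWorks endpoint documentation for your environment; OpenAPI Explorer can generate equivalent code for other SDKs.

from aliyunsdkcore.client import AcsClient
from aliyunsdkcore.request import CommonRequest

# Credentials and region are placeholders.
client = AcsClient('<your-access-key-id>', '<your-access-key-secret>', 'cn-shanghai')

request = CommonRequest()
request.set_domain('dataworks.cn-shanghai.aliyuncs.com')  # assumed regional endpoint
request.set_version('2020-05-18')                         # assumed DataWorks API version
request.set_action_name('GetIDEEventDetail')
# MessageId comes from the OpenEvent message that was received when the
# extension point event was triggered; ProjectId is the workspace ID (Long).
request.add_query_param('MessageId', '8abcb91f-d266-4073-b907-2ed670378ed1')
request.add_query_param('ProjectId', '10000')

response = client.do_action_with_exception(request)  # raw JSON bytes
print(response)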

Response parameters

RequestId (String)
Example: 8abcb91f-d266-4073-b907-2ed670378ed1
The request ID.

EventDetail (Object)
The data snapshot that is generated when an extension point event is triggered. The fields contained in data snapshots vary based on the types of the triggered extension point events. For more information, see the description of the fields.

  FileExecutionCommand (Object)
  The data snapshot when the code in the file is run. This parameter is valid only if the message type is IDE_FILE_EXECUTE_BEFORE.

    FileId (Long)
    Example: 1234123
    The file ID.

    DataSourceName (String)
    Example: odps_source
    The name of the data source with which the file is associated.

    Content (String)
    Example: SHOW TABLES;
    The code in the file of the current version.

    FileType (Long)
    Example: 10
    The type of the code for the file. Valid values: 6 (Shell), 10 (ODPS SQL), 11 (ODPS MR), 23 (Data Integration), 24 (ODPS Script), 99 (zero load), 221 (PyODPS 2), 225 (ODPS Spark), 227 (EMR Hive), 228 (EMR Spark), 229 (EMR Spark SQL), 230 (EMR MR), 239 (OSS object inspection), 257 (EMR Shell), 258 (EMR Spark Shell), 259 (EMR Presto), 260 (EMR Impala), 900 (real-time synchronization), 1089 (cross-tenant collaboration), 1091 (Hologres development), 1093 (Hologres SQL), 1100 (assignment), and 1221 (PyODPS 3).
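
The FileType codes above also appear in the DeletedFile and CommittedFile snapshots below. If you process snapshots programmatically, a small lookup table such as the following sketch, which covers only the values listed in this topic, can translate the codes into readable names.

# Lookup table for the FileType codes documented in this topic.
FILE_TYPE_NAMES = {
    6: "Shell",
    10: "ODPS SQL",
    11: "ODPS MR",
    23: "Data Integration",
    24: "ODPS Script",
    99: "zero load",
    221: "PyODPS 2",
    225: "ODPS Spark",
    227: "EMR Hive",
    228: "EMR Spark",
    229: "EMR Spark SQL",
    230: "EMR MR",
    239: "OSS object inspection",
    257: "EMR Shell",
    258: "EMR Spark Shell",
    259: "EMR Presto",
    260: "EMR Impala",
    900: "real-time synchronization",
    1089: "cross-tenant collaboration",
    1091: "Hologres development",
    1093: "Hologres SQL",
    1100: "assignment",
    1221: "PyODPS 3",
}

def file_type_name(code: int) -> str:
    """Return a readable name for a FileType code, or mark the code as unknown."""
    return FILE_TYPE_NAMES.get(code, f"unknown ({code})")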

  DeletedFile (Object)
  The data snapshot when the file is deleted. This parameter is valid only if the message type is IDE_FILE_DELETE_BEFORE.

    Owner (String)
    Example: 7384234****
    The file owner.

    FileType (Long)
    Example: 10
    The type of the code for the file. Valid values: 6 (Shell), 10 (ODPS SQL), 11 (ODPS MR), 23 (Data Integration), 24 (ODPS Script), 99 (zero load), 221 (PyODPS 2), 225 (ODPS Spark), 227 (EMR Hive), 228 (EMR Spark), 229 (EMR Spark SQL), 230 (EMR MR), 239 (OSS object inspection), 257 (EMR Shell), 258 (EMR Spark Shell), 259 (EMR Presto), 260 (EMR Impala), 900 (real-time synchronization), 1089 (cross-tenant collaboration), 1091 (Hologres development), 1093 (Hologres SQL), 1100 (assignment), and 1221 (PyODPS 3).

    CurrentVersion (Long)
    Example: 1
    The latest version number of the file.

    BusinessId (Long)
    Example: 74328
    The ID of the workflow to which the file belongs.

    FileName (String)
    Example: hello_dataworks.sql
    The name of the file.

    DataSourceName (String)
    Example: odps_source
    The name of the data source with which the file is associated.

    UseType (String)
    Example: NORMAL
    The module to which the file belongs. Valid values:
      • NORMAL: The file is used for DataStudio.
      • MANUAL: The file is used for a manually triggered node.
      • MANUAL_BIZ: The file is used for a manually triggered workflow.
      • SKIP: The file is used for a dry-run DataStudio node.
      • ADHOCQUERY: The file is used for an ad hoc query.
      • COMPONENT: The file is used for a snippet.

    FolderId (String)
    Example: aldurie78l2falure
    The ID of the folder to which the file belongs. You can call the GetFolder operation to query the details of the file based on the folder ID.

    ParentFileId (Long)
    Example: 1234122
    The ID of the do-while node or for-each node that corresponds to the file.

    Content (String)
    Example: SHOW TABLES;
    The code in the file of the current version.

    NodeId (Long)
    Example: 421429
    The ID of the node that is scheduled.

    FileId (Long)
    Example: 1234123
    The file ID.

  CommittedFile (Object)
  The data snapshot when the file is committed and deployed. This parameter is valid only if the message type is IDE_FILE_SUBMIT_BEFORE or IDE_FILE_DEPLOY_BEFORE.

    FileId (Long)
    Example: 1234123
    The file ID.

    Content (String)
    Example: SHOW TABLES;
    The code in the file of the current version.

    Committor (String)
    Example: 7384234****
    The ID of the Alibaba Cloud account that is used to generate the file of the current version.

    FileType (Long)
    Example: 10
    The type of the code for the file. Valid values: 6 (Shell), 10 (ODPS SQL), 11 (ODPS MR), 23 (Data Integration), 24 (ODPS Script), 99 (zero load), 221 (PyODPS 2), 225 (ODPS Spark), 227 (EMR Hive), 228 (EMR Spark), 229 (EMR Spark SQL), 230 (EMR MR), 239 (OSS object inspection), 257 (EMR Shell), 258 (EMR Spark Shell), 259 (EMR Presto), 260 (EMR Impala), 900 (real-time synchronization), 1089 (cross-tenant collaboration), 1091 (Hologres development), 1093 (Hologres SQL), 1100 (assignment), and 1221 (PyODPS 3).

    ChangeType (String)
    Example: UPDATE
    The type of the change to the file of the current version. Valid values: CREATE, UPDATE, and DELETE.

    FileName (String)
    Example: hello_dataworks.sql
    The name of the file.

    NodeId (Long)
    Example: 421429
    The ID of the node that is scheduled.

    Comment (String)
    Example: Second version
    The description of the file version.

    UseType (String)
    Example: NORMAL
    The module to which the file belongs. Valid values:
      • NORMAL: The file is used for DataStudio.
      • MANUAL: The file is used for a manually triggered node.
      • MANUAL_BIZ: The file is used for a manually triggered workflow.
      • SKIP: The file is used for a dry-run DataStudio node.
      • ADHOCQUERY: The file is used for an ad hoc query.
      • COMPONENT: The file is used for a snippet.

    FilePropertyContent (Object)
    The details of the file.

      DataSourceName (String)
      Example: odps_source
      The name of the data source with which the file is associated.

      ParentFileId (Long)
      Example: 1234122
      The ID of the do-while node or for-each node that corresponds to the file.

      BusinessId (Long)
      Example: 74328
      The ID of the workflow to which the file belongs.

      CurrentVersion (Long)
      Example: 1
      The latest version number of the file.

      Owner (String)
      Example: 7384234****
      The file owner.

      FolderId (String)
      Example: aldurie78l2falure
      The ID of the folder to which the file belongs. You can call the GetFolder operation to query the details of the file based on the folder ID.

    NodeConfiguration (Object)
    The scheduling properties of the node that corresponds to the file.

      RerunMode (String)
      Example: ALL_ALLOWED
      Indicates whether the node that corresponds to the file can be rerun. Valid values:
        • ALL_ALLOWED: The node can be rerun regardless of whether it is successfully run or fails to run.
        • FAILURE_ALLOWED: The node can be rerun only after it fails to run.
        • ALL_DENIED: The node cannot be rerun regardless of whether it is successfully run or fails to run.
      This parameter corresponds to the Rerun parameter in the Schedule section of the Properties tab in the DataWorks console.

      SchedulerType (String)
      Example: NORMAL
      The scheduling type of the node. Valid values:
        • NORMAL: The node is an auto triggered node.
        • MANUAL: The node is a manually triggered node. Manually triggered nodes cannot be automatically triggered. They correspond to the nodes in the Manually Triggered Workflows pane.
        • PAUSE: The node is a paused node.
        • SKIP: The node is a dry-run node. Dry-run nodes are started as scheduled, but the system sets the status of the nodes to successful when it starts to run them.

      ParaValue (String)
      Example: a=x b=y
      The scheduling parameters of the node.
      This parameter corresponds to the Scheduling Parameter section of the Properties tab in the DataWorks console. For more information about the configurations of scheduling parameters, see Configure scheduling parameters.

      CycleType (String)
      Example: DAY
      The type of the scheduling cycle of the node that corresponds to the file. Valid values: NOT_DAY and DAY. The value NOT_DAY indicates that the node is scheduled to run by minute or hour. The value DAY indicates that the node is scheduled to run by day, week, or month.
      This parameter corresponds to the Scheduling Cycle parameter in the Schedule section of the Properties tab in the DataWorks console.

      DependentNodeIdList (String)
      Example: 5,10,15,20
      The IDs of the nodes on which the node that corresponds to the file depends when the DependentType parameter is set to USER_DEFINE. Multiple IDs are separated by commas (,).
      The value of this parameter is equivalent to the IDs of the nodes that you specified after you select Previous Cycle and set Depend On to Other Nodes in the Dependencies section of the Properties tab in the DataWorks console.

      ResourceGroupId (Long)
      Example: 375827434852437
      The ID of the resource group that is used to run the node that corresponds to the file. You can call the ListResourceGroups operation to query the available resource groups in the workspace.

      AutoRerunTimes (Long)
      Example: 3
      The number of times that the node corresponding to the file can be rerun.

      AutoRerunIntervalMillis (Long)
      Example: 120000
      The interval at which the node corresponding to the file is rerun. Unit: milliseconds.

      CronExpress (String)
      Example: 00 05 00 * * ?
      The CRON expression that is used to schedule the node corresponding to the file.

      InputList (Array of Input)
      The output names of the parent files on which the current file depends.

        Input (String)
        Example: dw_project_root
        The output name of the parent file on which the current file depends.
        This parameter corresponds to the Output Name of Ancestor Node parameter under Parent Nodes after Same Cycle is selected in the Dependencies section of the Properties tab in the DataWorks console.

        ParseType (String)
        Example: MANUAL
        The mode of the configuration file dependency. Valid values:
          • MANUAL: Scheduling dependencies are manually configured.
          • AUTO: Scheduling dependencies are automatically parsed.

      OutputList (Array of Output)
      The output names of the current file.
      This parameter corresponds to the Output Name parameter under Output after Same Cycle is selected in the Dependencies section of the Properties tab in the DataWorks console.

        RefTableName (String)
        Example: ods_user_info_d
        The output table name of the current file.
        This parameter corresponds to the Output Table Name parameter under Output after Same Cycle is selected in the Dependencies section of the Properties tab in the DataWorks console.

        Output (String)
        Example: dw_project.002_out
        The output name of the current file.
        This parameter corresponds to the Output Name parameter under Output after Same Cycle is selected in the Dependencies section of the Properties tab in the DataWorks console.

      DependentType (String)
      Example: USER_DEFINE
      The type of the cross-cycle scheduling dependency of the node. Valid values:
        • SELF: The instance generated for the node in the current cycle depends on the instance generated for the node in the previous cycle.
        • CHILD: The instance generated for the node in the current cycle depends on the instances generated for the descendant nodes at the nearest level of the node in the previous cycle.
        • USER_DEFINE: The instance generated for the node in the current cycle depends on the instances generated for one or more specified nodes in the previous cycle.
        • NONE: No cross-cycle scheduling dependency type is selected for the node.
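
If you post-process the NodeConfiguration object, two fields usually need conversion: DependentNodeIdList is a comma-separated string of node IDs, and AutoRerunIntervalMillis is expressed in milliseconds. The following sketch, which assumes the object has been deserialized into a plain Python dict with the field names documented above, normalizes both.

def parse_node_configuration(node_conf: dict) -> dict:
    """Extract the NodeConfiguration fields that usually need post-processing."""
    # DependentNodeIdList is a comma-separated string of node IDs; it is only
    # meaningful when DependentType is USER_DEFINE.
    raw_ids = node_conf.get("DependentNodeIdList") or ""
    dependent_node_ids = [int(part) for part in raw_ids.split(",") if part.strip()]

    return {
        "rerun_mode": node_conf.get("RerunMode"),
        "scheduler_type": node_conf.get("SchedulerType"),
        "cron_express": node_conf.get("CronExpress"),
        "auto_rerun_times": node_conf.get("AutoRerunTimes", 0),
        # Convert the rerun interval from milliseconds to seconds.
        "auto_rerun_interval_seconds": (node_conf.get("AutoRerunIntervalMillis") or 0) / 1000,
        "dependent_type": node_conf.get("DependentType"),
        "dependent_node_ids": dependent_node_ids,
    }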

  TableModel (Object)
  The data snapshot when the table is committed and deployed. This parameter is valid only if the message type is IDE_TABLE_SUBMIT_BEFORE or IDE_TABLE_DEPLOY_BEFORE.

    Env (String)
    Example: DEV
    The environment in which the table is used. Valid values:
      • DEV
      • PROD

    LifeCycle (Long)
    Example: 7
    The lifecycle of the table. Unit: day.

    TableName (String)
    Example: tb_hello
    The name of the table.

    DataSourceName (String)
    Example: odps_source
    The name of the data source to which the table belongs.

    Columns (Array of Column)
    The columns in the table.

      ColumnName (String)
      Example: ID
      The name of the column.

      ColumnType (String)
      Example: BIGINT
      The data type of the column.

      IsPartitionColumn (Boolean)
      Example: false
      Indicates whether the column is a partition key column. Valid values:
        • true
        • false

      Comment (String)
      Example: ID
      The remarks of the column.

    Comment (String)
    Example: A new table
    The remarks of the table.

    Location (String)
    Example: hdfs://path/to/object
    The path of the table.
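
Which snapshot object in EventDetail is populated depends on the type of the triggered extension point event, as noted in the field descriptions above. The following sketch illustrates the dispatch; the event type string is taken from the OpenEvent message itself, not from this response.

def pick_snapshot(event_type: str, event_detail: dict):
    """Return the name and value of the snapshot object that matches the event type.

    The event type names are the ones listed in the field descriptions above.
    """
    if event_type == "IDE_FILE_EXECUTE_BEFORE":
        return "FileExecutionCommand", event_detail.get("FileExecutionCommand")
    if event_type == "IDE_FILE_DELETE_BEFORE":
        return "DeletedFile", event_detail.get("DeletedFile")
    if event_type in ("IDE_FILE_SUBMIT_BEFORE", "IDE_FILE_DEPLOY_BEFORE"):
        return "CommittedFile", event_detail.get("CommittedFile")
    if event_type in ("IDE_TABLE_SUBMIT_BEFORE", "IDE_TABLE_DEPLOY_BEFORE"):
        return "TableModel", event_detail.get("TableModel")
    return None, None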

Examples

Sample requests

http(s)://[Endpoint]/?Action=GetIDEEventDetail
&MessageId=8abcb91f-d266-4073-b907-2ed670378ed1
&ProjectId=10000
&<Common request parameters>

Sample success responses

XML format

HTTP/1.1 200 OK
Content-Type:application/xml

<GetIDEEventDetailResponse>
    <RequestId>8abcb91f-d266-4073-b907-2ed670378ed1</RequestId>
    <EventDetail>
        <FileExecutionCommand>
            <FileId>1234123</FileId>
            <DataSourceName>odps_source</DataSourceName>
            <Content>SHOW TABLES;</Content>
            <FileType>10</FileType>
        </FileExecutionCommand>
        <DeletedFile>
            <Owner>7384234****</Owner>
            <FileType>10</FileType>
            <CurrentVersion>1</CurrentVersion>
            <BusinessId>74328</BusinessId>
            <FileName>hello_dataworks.sql</FileName>
            <DataSourceName>odps_source</DataSourceName>
            <UseType>NORMAL</UseType>
            <FolderId>aldurie78l2falure</FolderId>
            <ParentFileId>1234122</ParentFileId>
            <Content>SHOW TABLES;</Content>
            <NodeId>421429</NodeId>
            <FileId>1234123</FileId>
        </DeletedFile>
        <CommittedFile>
            <FileId>1234123</FileId>
            <Content>SHOW TABLES;</Content>
            <Committor>7384234****</Committor>
            <FileType>10</FileType>
            <ChangeType>UPDATE</ChangeType>
            <FileName>hello_dataworks.sql</FileName>
            <NodeId>421429</NodeId>
            <Comment>Second version</Comment>
            <UseType>NORMAL</UseType>
            <FilePropertyContent>
                <DataSourceName>odps_source</DataSourceName>
                <ParentFileId>1234122</ParentFileId>
                <BusinessId>74328</BusinessId>
                <CurrentVersion>1</CurrentVersion>
                <Owner>7384234****</Owner>
                <FolderId>aldurie78l2falure</FolderId>
            </FilePropertyContent>
            <NodeConfiguration>
                <RerunMode>ALL_ALLOWED</RerunMode>
                <SchedulerType>NORMAL</SchedulerType>
                <ParaValue>a=x b=y</ParaValue>
                <CycleType>DAY</CycleType>
                <DependentNodeIdList>5,10,15,20</DependentNodeIdList>
                <ResourceGroupId>375827434852437</ResourceGroupId>
                <AutoRerunTimes>3</AutoRerunTimes>
                <AutoRerunIntervalMillis>120000</AutoRerunIntervalMillis>
                <CronExpress>00 05 00 * * ?</CronExpress>
                <InputList>
                    <Input>dw_project_root</Input>
                    <ParseType>MANUAL</ParseType>
                </InputList>
                <OutputList>
                    <RefTableName>ods_user_info_d</RefTableName>
                    <Output>dw_project.002_out</Output>
                </OutputList>
                <DependentType>USER_DEFINE</DependentType>
            </NodeConfiguration>
        </CommittedFile>
        <TableModel>
            <Env>DEV</Env>
            <LifeCycle>7</LifeCycle>
            <TableName>tb_hello</TableName>
            <DataSourceName>odps_source</DataSourceName>
            <Columns>
                <ColumnName>ID</ColumnName>
                <ColumnType>BIGINT</ColumnType>
                <IsPartitionColumn>false</IsPartitionColumn>
                <Comment>ID</Comment>
            </Columns>
            <Comment>A new table </Comment>
            <Location>hdfs://path/to/object</Location>
        </TableModel>
    </EventDetail>
</GetIDEEventDetailResponse>

JSON format

HTTP/1.1 200 OK
Content-Type:application/json

{
  "RequestId" : "8abcb91f-d266-4073-b907-2ed670378ed1",
  "EventDetail" : {
    "FileExecutionCommand" : {
      "FileId" : 1234123,
      "DataSourceName" : "odps_source",
      "Content" : "SHOW TABLES;",
      "FileType" : 10
    },
    "DeletedFile" : {
      "Owner" : "7384234****",
      "FileType" : 10,
      "CurrentVersion" : 1,
      "BusinessId" : 74328,
      "FileName" : "hello_dataworks.sql",
      "DataSourceName" : "odps_source",
      "UseType" : "NORMAL",
      "FolderId" : "aldurie78l2falure",
      "ParentFileId" : 1234122,
      "Content" : "SHOW TABLES;",
      "NodeId" : 421429,
      "FileId" : 1234123
    },
    "CommittedFile" : {
      "FileId" : 1234123,
      "Content" : "SHOW TABLES;",
      "Committor" : "7384234****",
      "FileType" : 10,
      "ChangeType" : "UPDATE",
      "FileName" : "hello_dataworks.sql",
      "NodeId" : 421429,
      "Comment" : "Second version",
      "UseType" : "NORMAL",
      "FilePropertyContent" : {
        "DataSourceName" : "odps_source",
        "ParentFileId" : 1234122,
        "BusinessId" : 74328,
        "CurrentVersion" : 1,
        "Owner" : "7384234****",
        "FolderId" : "aldurie78l2falure"
      },
      "NodeConfiguration" : {
        "RerunMode" : "ALL_ALLOWED",
        "SchedulerType" : "NORMAL",
        "ParaValue" : "a=x b=y",
        "CycleType" : "DAY",
        "DependentNodeIdList" : "5,10,15,20",
        "ResourceGroupId" : 375827434852437,
        "AutoRerunTimes" : 3,
        "AutoRerunIntervalMillis" : 120000,
        "CronExpress" : "00 05 00 * * ?",
        "InputList" : {
          "Input" : "dw_project_root",
          "ParseType" : "MANUAL"
        },
        "OutputList" : {
          "RefTableName" : "ods_user_info_d",
          "Output" : "dw_project.002_out"
        },
        "DependentType" : "USER_DEFINE"
      }
    },
    "TableModel" : {
      "Env" : "DEV",
      "LifeCycle" : 7,
      "TableName" : "tb_hello",
      "DataSourceName" : "odps_source",
      "Columns" : {
        "ColumnName" : "ID",
        "ColumnType" : "BIGINT",
        "IsPartitionColumn" : false,
        "Comment" : "ID"
      },
      "Comment" : "A new table",
      "Location" : "hdfs://path/to/object"
    }
  }
}
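
The following sketch parses a JSON response such as the one above and prints the table schema carried in TableModel. Columns is documented as an array although the sample shows a single element, so the helper accepts either shape; the response_body argument stands for the raw JSON string returned by the API.

import json

def print_table_schema(response_body: str) -> None:
    """Print the schema carried in TableModel, if the response contains one."""
    payload = json.loads(response_body)
    table = payload.get("EventDetail", {}).get("TableModel")
    if not table:
        return
    print(f"{table['TableName']} (env={table['Env']}, lifecycle={table['LifeCycle']} days)")
    # Columns is documented as an array, but the sample above shows a single
    # object; accept either shape.
    columns = table.get("Columns") or []
    if isinstance(columns, dict):
        columns = [columns]
    for col in columns:
        partition = " [partition]" if col.get("IsPartitionColumn") else ""
        comment = col.get("Comment", "")
        print(f"  {col['ColumnName']} {col['ColumnType']}{partition}  # {comment}")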

Error codes

HTTP 400: InternalError.UserId.Missing
Error message: An internal system error occurred. Try again later.
Description: An internal error occurred. Try again later.

HTTP 429: Throttling.Api
Error message: The request for this resource has exceeded your available limit.
Description: The number of requests for the resource has exceeded the upper limit.

HTTP 429: Throttling.System
Error message: The DataWorks system is busy. Try again later.
Description: The DataWorks system is busy. Try again later.

HTTP 429: Throttling.User
Error message: Your request is too frequent. Try again later.
Description: Excessive requests have been submitted within a short period of time. Try again later.

HTTP 500: InternalError.System
Error message: An internal system error occurred. Try again later.
Description: An internal error occurred. Try again later.

For a list of error codes, see Service error codes.
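
The 429 error codes above indicate throttling and can be retried after a delay. The following sketch wraps a call in an exponential backoff loop; how the error code is extracted from a failed call depends on the SDK that you use, so the get_error_code callback is a placeholder for that mapping.

import random
import time

# Throttling codes listed in the Error codes section above.
THROTTLING_CODES = {"Throttling.Api", "Throttling.System", "Throttling.User"}

def call_with_backoff(call, get_error_code, max_attempts=5):
    """Run call(); retry with exponential backoff when it fails with a throttling code.

    get_error_code maps a raised exception to the service error code (SDK-specific).
    """
    for attempt in range(max_attempts):
        try:
            return call()
        except Exception as exc:  # narrow this to your SDK's exception type
            code = get_error_code(exc)
            if code not in THROTTLING_CODES or attempt == max_attempts - 1:
                raise
            # Back off before the next attempt: 1 s, 2 s, 4 s, ... plus jitter.
            time.sleep(2 ** attempt + random.random())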