
AnalyticDB:CreateStreamingJob

Last Updated:Jul 14, 2025

Creates a real-time (streaming) data synchronization job for a configured external data source.

Debugging

You can run this operation directly in OpenAPI Explorer, which calculates the request signature for you. After the call succeeds, OpenAPI Explorer can automatically generate SDK code samples.
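The signature that OpenAPI Explorer computes for you follows the Alibaba Cloud RPC-style signing scheme. The sketch below, assuming the standard HMAC-SHA1 string-to-sign construction and placeholder credentials, shows roughly what is involved if you sign a request yourself:

```python
import base64
import hashlib
import hmac
import urllib.parse

def percent_encode(s: str) -> str:
    # RFC 3986 encoding: only unreserved characters (A-Za-z0-9-_.~) stay literal
    return urllib.parse.quote(s, safe="")

def sign_rpc_request(params: dict, access_key_secret: str, http_method: str = "POST") -> str:
    # 1. Sort parameters by name and build the canonicalized query string
    canonical = "&".join(
        f"{percent_encode(k)}={percent_encode(v)}" for k, v in sorted(params.items())
    )
    # 2. Build the string-to-sign: METHOD&%2F&<encoded canonical query>
    string_to_sign = f"{http_method}&{percent_encode('/')}&{percent_encode(canonical)}"
    # 3. HMAC-SHA1 with "<AccessKeySecret>&" as the key, then Base64-encode the digest
    digest = hmac.new(
        (access_key_secret + "&").encode(), string_to_sign.encode(), hashlib.sha1
    ).digest()
    return base64.b64encode(digest).decode()

# Placeholder parameters and secret; a real request also carries common
# parameters such as Version, SignatureNonce, and Timestamp.
params = {
    "Action": "CreateStreamingJob",
    "DBInstanceId": "gp-bp10g78o9807yv9h3",
    "DataSourceId": "1",
    "JobName": "test-job",
}
signature = sign_rpc_request(params, "my-access-key-secret")
```

The resulting Base64 string is passed as the `Signature` request parameter.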

Authorization information

The following table shows the authorization information for this operation. You can use this information in the Action element of a RAM policy to grant a RAM user or RAM role the permissions to call this operation. Description:

  • Operation: the value that you can use in the Action element to specify the operation on a resource.
  • Access level: the access level of each operation. The levels are read, write, and list.
  • Resource type: the type of the resource on which you can authorize the RAM user or the RAM role to perform the operation. Take note of the following items:
    • Required resource types are marked with an asterisk (*).
    • If the permissions cannot be granted at the resource level, All Resources is used in the Resource type column of the operation.
  • Condition Key: the condition key that is defined by the cloud service.
  • Associated operation: other operations that the RAM user or RAM role must be authorized to perform before it can complete this operation.
Operation: gpdb:CreateStreamingJob
Access level: create
Resource type: *DBInstance
  acs:gpdb:{#regionId}:{#accountId}:dbinstance/{#DBInstanceId}
Condition key: none
Associated operation: none
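As an illustration of the authorization information above, the following sketch builds a hypothetical RAM policy document that grants only this operation on a single instance; the region, account ID, and instance ID are placeholders, not real values.

```python
import json

# Hypothetical RAM policy: allow gpdb:CreateStreamingJob on one instance only.
# "cn-beijing", "1234567890", and the instance ID below are placeholders.
policy = {
    "Version": "1",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "gpdb:CreateStreamingJob",
            "Resource": "acs:gpdb:cn-beijing:1234567890:dbinstance/gp-bp10g78o9807yv9h3",
        }
    ],
}
print(json.dumps(policy, indent=2))
```

Attach a policy of this shape to the RAM user or role that will call the API.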

Request parameters

RegionId (string, optional)

The region ID.

Note: You can call the DescribeRegions operation to query available region IDs.

Example: cn-beijing
DBInstanceId (string, required)

The instance ID.

Example: gp-bp10g78o9807yv9h3
DataSourceId (string, required)

The data source ID.

Example: 1
JobDescription (string, optional)

The job description.

Example: test-job
JobName (string, required)

The job name.

Example: test-job
Mode (string, optional)

The configuration mode. Valid values:

  • basic: you specify individual configuration fields.
  • professional: you submit a YAML configuration file (see JobConfig).

Example: basic
SrcColumns (array of strings, optional)

The list of source data fields. Each element is the name of a source field.

Example: src_column_1
DestColumns (array of strings, optional)

The list of mapped fields in the target table. Each element is the name of a target column.

Example: dest_column_1
Account (string, optional)

The target database account.

Example: test-account
Password (string, optional)

The target database password.

Example: pwd123
DestDatabase (string, optional)

The target database name.

Example: dest-db
DestSchema (string, optional)

The target schema (namespace).

Example: dest-schema
DestTable (string, optional)

The target table name.

Example: dest-table
WriteMode (string, optional)

The write mode. Valid values:

  • INSERT
  • UPDATE
  • MERGE

Example: MERGE
TryRun (boolean, optional)

Specifies whether to perform a dry run of the real-time job. Valid values:

  • true
  • false

Default value: false.

Example: true
JobConfig (string, optional)

The job configuration file in YAML format. Required in professional mode.

Example:

DATABASE: adbpgss_test
USER: adbpgss_test
PASSWORD: adbpgssTest
HOST: gp-xxx-master.gpdb.rds-aliyun-pre.rds.aliyuncs.com
PORT: 5432
KAFKA:
  INPUT:
    SOURCE:
      BROKERS: broker1:9092,broker2:9092,broker3:9092
      TOPIC: testtopic
      FALLBACK_OFFSET: earliest
    KEY:
      COLUMNS:
        - NAME: customer_id
          TYPE: int
      FORMAT: delimited
      DELIMITED_OPTION:
        DELIMITER: '|'
    VALUE:
      COLUMNS:
        - TYPE: integer
          NAME: l_orderkey
        - TYPE: integer
          NAME: l_partkey
        - TYPE: integer
          NAME: l_suppkey
        - TYPE: integer
          NAME: l_linenumber
        - TYPE: decimal
          NAME: l_quantity
        - TYPE: decimal
          NAME: l_extendedprice
        - TYPE: decimal
          NAME: l_discount
        - TYPE: decimal
          NAME: l_tax
        - TYPE: char
          NAME: l_returnflag
        - TYPE: char
          NAME: l_linestatus
        - TYPE: date
          NAME: l_shipdate
        - TYPE: date
          NAME: l_commitdate
        - TYPE: date
          NAME: l_receiptdate
        - TYPE: text
          NAME: l_shipinstruct
        - TYPE: text
          NAME: l_shipmode
        - TYPE: text
          NAME: l_comment
      FORMAT: delimited
      DELIMITED_OPTION:
        DELIMITER: '|'
    ERROR_LIMIT: 10
  OUTPUT:
    SCHEMA: adbpgss_test
    TABLE: write_with_insert_plaintext
    MODE: MERGE
    MATCH_COLUMNS:
      - l_orderkey
      - l_partkey
      - l_suppkey
    UPDATE_COLUMNS:
      - l_linenumber
      - l_quantity
      - l_extendedprice
      - l_discount
      - l_tax
      - l_returnflag
      - l_linestatus
      - l_shipdate
      - l_commitdate
      - l_receiptdate
      - l_shipinstruct
      - l_shipmode
      - l_comment
    MAPPING:
      - EXPRESSION: l_orderkey
        NAME: l_orderkey
      - EXPRESSION: l_partkey
        NAME: l_partkey
      - EXPRESSION: l_suppkey
        NAME: l_suppkey
      - EXPRESSION: l_linenumber
        NAME: l_linenumber
      - EXPRESSION: l_quantity
        NAME: l_quantity
      - EXPRESSION: l_extendedprice
        NAME: l_extendedprice
      - EXPRESSION: l_discount
        NAME: l_discount
      - EXPRESSION: l_tax
        NAME: l_tax
      - EXPRESSION: l_returnflag
        NAME: l_returnflag
      - EXPRESSION: l_linestatus
        NAME: l_linestatus
      - EXPRESSION: l_shipdate
        NAME: l_shipdate
      - EXPRESSION: l_commitdate
        NAME: l_commitdate
      - EXPRESSION: l_receiptdate
        NAME: l_receiptdate
      - EXPRESSION: l_shipinstruct
        NAME: l_shipinstruct
      - EXPRESSION: l_shipmode
        NAME: l_shipmode
      - EXPRESSION: l_comment
        NAME: l_comment
  COMMIT:
    MAX_ROW: 1000
    MINIMAL_INTERVAL: 1000
    CONSISTENCY: ATLEAST
  POLL:
    BATCHSIZE: 1000
    TIMEOUT: 1000
  PROPERTIES:
    group.id: testgroup
GroupName (string, optional)

The Kafka consumer group name.

Example: group_name
FallbackOffset (string, optional)

The fallback offset. This parameter defines the position from which consumption starts when the consumer has not requested a specific offset, or when the requested offset is outside the range of offsets recorded by the Kafka cluster. Consumption can fall back to the earliest (oldest) or the latest (newest) offset. Valid values:

  • EARLIEST
  • LATEST

Example: EARLIEST
MatchColumns (array of strings, optional)

The match columns, usually all primary key columns of the target table. Two rows are considered duplicates if they have identical values in all of these columns.

Example: column_1
UpdateColumns (array of strings, optional)

The update columns, usually all non-primary-key columns of the target table. When a row is identified as a duplicate based on MatchColumns, the values of the UpdateColumns in the old row are overwritten with those of the new row.

Example: column_1
ErrorLimitCount (long, optional)

The maximum number of rows that are allowed to fail to be written because the data in Kafka does not match the schema of the AnalyticDB for PostgreSQL target table. The job fails when this limit is exceeded.

Example: 5
Consistency (string, optional)

The delivery guarantee. Valid values:

  • ATLEAST
  • EXACTLY

Example: ATLEAST
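To make the interaction of WriteMode=MERGE with MatchColumns and UpdateColumns concrete, the following sketch models the documented semantics with a plain in-memory dictionary. It is a simplified illustration of the behavior described above, not the service implementation; the column names are invented for the example.

```python
# Simplified model of MERGE-mode writes: a row whose MatchColumns values
# equal those of an existing row is a duplicate, and only its UpdateColumns
# are overwritten; any other row is inserted as new.
match_columns = ["id"]            # usually the primary key columns
update_columns = ["name", "qty"]  # usually the non-primary-key columns

table = {}  # target table, keyed by the tuple of MatchColumns values

def merge_row(row: dict) -> None:
    key = tuple(row[c] for c in match_columns)
    if key in table:
        # Duplicate per MatchColumns: new values overwrite the UpdateColumns.
        for c in update_columns:
            table[key][c] = row[c]
    else:
        table[key] = dict(row)

merge_row({"id": 1, "name": "a", "qty": 10})
merge_row({"id": 1, "name": "b", "qty": 20})  # same id -> updated in place
merge_row({"id": 2, "name": "c", "qty": 30})  # new id -> inserted
```

With INSERT mode every row would simply be appended, and with UPDATE mode only rows that match an existing key would be written.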

Response parameters

The response is an object that contains the following parameters.

RequestId (string)

The request ID.

Example: B4CAF581-2AC7-41AD-8940-D56DF7AADF5B

JobId (integer)

The ID of the created job.

Example: 1

Examples

Sample success responses

JSON format

{
  "RequestId": "B4CAF581-2AC7-41AD-8940-D56DF7AADF5B",
  "JobId": 1
}
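A caller typically extracts JobId from the response to track the new job. The sketch below parses the sample response shown above and checks the field types listed in the response-parameter table:

```python
import json

# Parse the sample success response and pull out the fields described above.
raw = '{"RequestId": "B4CAF581-2AC7-41AD-8940-D56DF7AADF5B", "JobId": 1}'
response = json.loads(raw)

request_id = response["RequestId"]  # string, useful when filing support tickets
job_id = response["JobId"]          # integer ID of the created streaming job
```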

Error codes

For a list of error codes, see Service error codes.