Create External Data Source Configuration
Request parameters
| Parameter | Type | Required | Description | Example |
|---|---|---|---|---|
| RegionId | string | No | Region ID. Note: you can call the DescribeRegions API to view available region IDs. | cn-beijing |
| DBInstanceId | string | Yes | Instance ID. | gp-bp10g78o9807yv9h3 |
| DataSourceId | string | Yes | Data source ID. | 1 |
| JobDescription | string | No | Job description. | test-job |
| JobName | string | Yes | Job name. | test-job |
| Mode | string | No | Configuration mode. Valid values: basic, professional. | basic |
| SrcColumns | array of string | No | Source data field list; each element is a source data field key. | src_column_1 |
| DestColumns | array of string | No | Target data table mapping field list; each element is a mapping field key in the target table. | dest_column_1 |
| Account | string | No | Target database account. | test-account |
| Password | string | No | Target database password. | pwd123 |
| DestDatabase | string | No | Target database name. | dest-db |
| DestSchema | string | No | Target namespace. | dest-schema |
| DestTable | string | No | Target table name. | dest-table |
| WriteMode | string | No | Write mode. Valid values: INSERT, UPDATE, MERGE. | INSERT |
| TryRun | boolean | No | Whether to test the real-time task. Valid values: true, false. Default value: false. | true |
| JobConfig | string | No | Job configuration file in YAML format; required for professional mode. | See the sample JobConfig below this table. |
| GroupName | string | No | Kafka group name. | group_name |
| FallbackOffset | string | No | Fallback offset. Valid values: EARLIEST, LATEST. | EARLIEST |
| MatchColumns | array of string | No | Match columns, usually all primary key columns of the target table. Two rows are considered duplicates if their values in all of these columns are identical. | column_1 |
| UpdateColumns | array of string | No | Update columns, usually all non-primary-key columns of the target table. When rows are identified as duplicates through MatchColumns, the values of the UpdateColumns columns are updated so that new data overwrites old data. | column_1 |
| ErrorLimitCount | long | No | The number of allowed error rows. Data in Kafka that does not match the ADBPG target table causes write failures; when the number of error rows exceeds this value, the task fails. | 5 |
| Consistency | string | No | Delivery guarantee. Valid values: ATLEAST, EXACTLY. | ATLEAST |

Sample JobConfig for professional mode (YAML):

```yaml
DATABASE: adbpgss_test
USER: adbpgss_test
PASSWORD: adbpgssTest
HOST: gp-xxx-master.gpdb.rds-aliyun-pre.rds.aliyuncs.com
PORT: 5432
KAFKA:
  INPUT:
    SOURCE:
      BROKERS: broker1:9092,broker2:9092,broker3:9092
      TOPIC: testtopic
      FALLBACK_OFFSET: earliest
    KEY:
      COLUMNS:
        - NAME: customer_id
          TYPE: int
      FORMAT: delimited
      DELIMITED_OPTION:
        DELIMITER: '|'
    VALUE:
      COLUMNS:
        - TYPE: integer
          NAME: l_orderkey
        - TYPE: integer
          NAME: l_partkey
        - TYPE: integer
          NAME: l_suppkey
        - TYPE: integer
          NAME: l_linenumber
        - TYPE: decimal
          NAME: l_quantity
        - TYPE: decimal
          NAME: l_extendedprice
        - TYPE: decimal
          NAME: l_discount
        - TYPE: decimal
          NAME: l_tax
        - TYPE: char
          NAME: l_returnflag
        - TYPE: char
          NAME: l_linestatus
        - TYPE: date
          NAME: l_shipdate
        - TYPE: date
          NAME: l_commitdate
        - TYPE: date
          NAME: l_receiptdate
        - TYPE: text
          NAME: l_shipinstruct
        - TYPE: text
          NAME: l_shipmode
        - TYPE: text
          NAME: l_comment
      FORMAT: delimited
      DELIMITED_OPTION:
        DELIMITER: '|'
    ERROR_LIMIT: 10
  OUTPUT:
    SCHEMA: adbpgss_test
    TABLE: write_with_insert_plaintext
    MODE: MERGE
    MATCH_COLUMNS:
      - l_orderkey
      - l_partkey
      - l_suppkey
    UPDATE_COLUMNS:
      - l_linenumber
      - l_quantity
      - l_extendedprice
      - l_discount
      - l_tax
      - l_returnflag
      - l_linestatus
      - l_shipdate
      - l_commitdate
      - l_receiptdate
      - l_shipinstruct
      - l_shipmode
      - l_comment
    MAPPING:
      - EXPRESSION: l_orderkey
        NAME: l_orderkey
      - EXPRESSION: l_partkey
        NAME: l_partkey
      - EXPRESSION: l_suppkey
        NAME: l_suppkey
      - EXPRESSION: l_linenumber
        NAME: l_linenumber
      - EXPRESSION: l_quantity
        NAME: l_quantity
      - EXPRESSION: l_extendedprice
        NAME: l_extendedprice
      - EXPRESSION: l_discount
        NAME: l_discount
      - EXPRESSION: l_tax
        NAME: l_tax
      - EXPRESSION: l_returnflag
        NAME: l_returnflag
      - EXPRESSION: l_linestatus
        NAME: l_linestatus
      - EXPRESSION: l_shipdate
        NAME: l_shipdate
      - EXPRESSION: l_commitdate
        NAME: l_commitdate
      - EXPRESSION: l_receiptdate
        NAME: l_receiptdate
      - EXPRESSION: l_shipinstruct
        NAME: l_shipinstruct
      - EXPRESSION: l_shipmode
        NAME: l_shipmode
      - EXPRESSION: l_comment
        NAME: l_comment
  COMMIT:
    MAX_ROW: 1000
    MINIMAL_INTERVAL: 1000
    CONSISTENCY: ATLEAST
  POLL:
    BATCHSIZE: 1000
    TIMEOUT: 1000
  PROPERTIES:
    group.id: testgroup
```
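As a sketch of how a client might assemble these request parameters, the helper below flattens the array parameters (SrcColumns, DestColumns, MatchColumns, UpdateColumns) into indexed query keys such as `SrcColumns.1`, a common convention for RPC-style cloud APIs. The helper name and the indexed-key wire format are assumptions for illustration, not confirmed by this reference.

```python
def build_request_params(db_instance_id, data_source_id, job_name,
                         src_columns=None, dest_columns=None,
                         match_columns=None, update_columns=None,
                         **optional):
    """Assemble the flat query-parameter map for the create-job request.

    Array parameters are expanded to indexed keys (Name.1, Name.2, ...),
    an assumed convention; other optional scalars pass through unchanged.
    """
    params = {
        "DBInstanceId": db_instance_id,  # required
        "DataSourceId": data_source_id,  # required
        "JobName": job_name,             # required
    }
    arrays = {
        "SrcColumns": src_columns,
        "DestColumns": dest_columns,
        "MatchColumns": match_columns,
        "UpdateColumns": update_columns,
    }
    for name, values in arrays.items():
        # Empty or omitted arrays contribute no keys at all.
        for i, value in enumerate(values or (), start=1):
            params[f"{name}.{i}"] = value
    params.update({k: v for k, v in optional.items() if v is not None})
    return params

# Example using the sample values from the table above.
params = build_request_params(
    "gp-bp10g78o9807yv9h3", "1", "test-job",
    src_columns=["src_column_1"],
    dest_columns=["dest_column_1"],
    Mode="basic", WriteMode="INSERT", TryRun=True,
)
```

Sending these parameters (with signing, region, and endpoint handling) would normally be done through an official SDK rather than by hand.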
Response parameters

| Parameter | Type | Description | Example |
|---|---|---|---|
| RequestId | string | Request ID. | B4CAF581-2AC7-41AD-8940-D56DF7AADF5B |
| JobId | integer | ID of the created job. | 1 |
Examples
Sample success responses
JSON format

```json
{
  "RequestId": "B4CAF581-2AC7-41AD-8940-D56DF7AADF5B",
  "JobId": 1
}
```

Error codes
For a list of error codes, see the Service error codes page.
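A minimal sketch of consuming the sample success response above; since this reference does not show the error-response shape, only the success path is handled, and the helper name is illustrative.

```python
import json

def parse_create_response(body: str) -> int:
    """Return the JobId from a success response body like the sample above."""
    resp = json.loads(body)
    if "JobId" not in resp:
        # The RequestId is useful when reporting a failed call to support.
        raise ValueError(f"no JobId in response {resp.get('RequestId')}")
    return resp["JobId"]

sample = '{"RequestId": "B4CAF581-2AC7-41AD-8940-D56DF7AADF5B", "JobId": 1}'
job_id = parse_create_response(sample)  # -> 1
```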
