Dataphin: Integration component library development instructions

Last Updated: May 28, 2025

Dataphin Data Integration's offline pipeline feature supports visual, component-based development. Once you've created an offline pipeline script, you can drag and drop components from a comprehensive library to build the pipeline. This visual approach simplifies development, improves efficiency, and streamlines the organization of source and destination data sources. This topic describes how to develop an offline pipeline task using the component library.

Prerequisites

To develop an offline pipeline, you must first create the corresponding development script. For more information, see Create an integration task through a single pipeline.

Offline pipeline component development entry

  1. Navigate to the Dataphin home page and select Development -> Data Integration from the top menu bar.

  2. To access the Offline Pipeline Component development page, follow these steps:

    Choose a Project (Dev-Prod mode requires selecting an environment) -> click Batch Pipeline -> select and click the offline pipeline you wish to develop -> click Component Library.


Offline component library development instructions

Typically, a complete offline pipeline is composed of one or more Inputs, zero or more Transforms and Flows, and one or more Outputs.
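This composition rule can be sketched as a small validity check. This is an illustration of the rule only, not Dataphin's actual validation logic; the component names and kind labels below are hypothetical:

```python
from collections import Counter

def validate_pipeline(components):
    """Check the composition rule: at least one Input, at least one
    Output, and any number of Transforms and Flows."""
    counts = Counter(kind for _, kind in components)
    errors = []
    if counts["input"] < 1:
        errors.append("pipeline needs at least one Input component")
    if counts["output"] < 1:
        errors.append("pipeline needs at least one Output component")
    return errors

# A minimal valid pipeline: one input, one transform, one output.
pipeline = [("mysql_src", "input"),
            ("filter_rows", "transform"),
            ("hive_sink", "output")]
print(validate_pipeline(pipeline))  # an empty list means the rule holds
```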

On the development page for a single offline pipeline script, click Component Library in the upper right corner to reveal components such as Favorite, Inputs, Transforms, Flows, Outputs, and Custom components.


Favorite components

Click the favorite icon to access components that the currently logged-in account has favorited in other component libraries. This allows you to quickly select and use frequently used components from the favorites.

Input components

Input components define the origin of the source data. Select the component that matches your business data type and drag it onto the pipeline canvas on the left to configure data input. For an explanation of the functions of each input component, see the configuration details for each component.

  • Input components cannot have ancestor nodes.

  • The descendant node of an Input can be a Transform, Output, or Flow.

  • When the Input component is connected to multiple descendant nodes, such as Outputs or Transforms, it is necessary to select a Data Sending Method for the Input component.

    • Replication: The data from the ancestor node is copied to every descendant node, with each descendant node receiving the full data set from the ancestor node.

    • Round-robin Distribution: The data from the ancestor node is distributed in a round-robin fashion among the descendant nodes, ensuring the combined data of all descendant nodes equals that of the ancestor node.
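The two sending methods can be sketched in plain Python. This illustrates only the semantics of the two options, not Dataphin's implementation; the function and node names are hypothetical:

```python
from itertools import cycle

def replicate(rows, downstreams):
    """Replication: every descendant node receives the full data set."""
    return {d: list(rows) for d in downstreams}

def round_robin(rows, downstreams):
    """Round-robin: rows are dealt out one at a time, so the union of
    all descendants' data equals the ancestor's data."""
    out = {d: [] for d in downstreams}
    for row, d in zip(rows, cycle(downstreams)):
        out[d].append(row)
    return out

rows = [1, 2, 3, 4, 5]
rep = replicate(rows, ["out_a", "out_b"])   # both get [1, 2, 3, 4, 5]
rr = round_robin(rows, ["out_a", "out_b"])  # out_a: [1, 3, 5], out_b: [2, 4]
```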

Output components

Integrate your data source by selecting the appropriate output component that aligns with your business requirements. Drag your chosen component onto the left pipeline canvas to facilitate data output. For comprehensive information on the functions of each output component, see the configuration details of each component.

Output components cannot have descendant nodes.

Flow components

Dataphin provides flow control during data integration through two types of components: throttling and conditional distribution. For an in-depth look at the functions of each component, see the configuration details of each component.

  • Flow components cannot serve as the initial or terminal nodes in an offline pipeline; however, they can be positioned anywhere between the start and end of the pipeline script.

  • When the Flow component is connected to multiple descendant nodes, such as Transforms, Outputs, or Flows, it is necessary to select a Data Sending Method for the Flow component.

  • If you use the Conditional Distribution flow component, you must specify the distribution condition when connecting the components:

    • Select Condition Result Is True to send data downstream when the ancestor node's result is true.

    • Select Condition Result Is False to send data downstream when the ancestor node's result is false.
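The routing behavior of the two connection options can be sketched as follows. This is a semantic illustration only, not Dataphin's implementation; the function name and sample fields are hypothetical:

```python
def conditional_distribute(rows, condition):
    """Route each row down the 'true' or 'false' branch depending on
    the condition result, mirroring the two connection options."""
    branches = {"true": [], "false": []}
    for row in rows:
        branches["true" if condition(row) else "false"].append(row)
    return branches

rows = [{"amount": 50}, {"amount": 500}, {"amount": 5}]
# Rows satisfying the condition go to the 'true' branch, the rest
# go to the 'false' branch.
result = conditional_distribute(rows, lambda r: r["amount"] >= 100)
```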

Transform components

Transform components process the source data from input components by performing operations such as computation, filtering, and encryption of data fields. For comprehensive information on the functions of each transform component, see the configuration details of each component.

Transform components can be connected to multiple downstream components, such as Transforms, Outputs, and Flows. When establishing these connections, it is necessary to specify the Transform component's Data Sending Method.
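The kinds of per-row operations transforms perform (field selection, masking of sensitive fields) can be sketched like this. The helper names are hypothetical, and SHA-256 hashing stands in for whatever encryption the real component applies:

```python
import hashlib

def select_fields(row, fields):
    """Field selection: keep only the listed fields."""
    return {k: row[k] for k in fields if k in row}

def mask_field(row, field):
    """A stand-in for the encryption transform: replace a field's
    value with its SHA-256 digest."""
    row = dict(row)
    row[field] = hashlib.sha256(str(row[field]).encode()).hexdigest()
    return row

row = {"id": 1, "name": "alice", "phone": "555-0100", "debug": "x"}
staged = select_fields(row, ["id", "name", "phone"])  # drops 'debug'
final = mask_field(staged, "phone")                   # hashes 'phone'
```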

Directed connections

After selecting the required components, draw directed connections from upstream input components to downstream transform, flow, and output components. At runtime, the integration task executes each component in the order determined by these directed connections, so every ancestor runs before its descendants.
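The execution-order guarantee is exactly a topological ordering of the pipeline's directed graph, which can be sketched with the standard library. The node names below are hypothetical:

```python
from graphlib import TopologicalSorter

# A hypothetical pipeline DAG: each node maps to its ancestors
# (predecessors), i.e. the components it receives data from.
edges = {
    "mysql_input": [],
    "filter": ["mysql_input"],
    "hive_output": ["filter"],
    "oss_output": ["filter"],
}

# static_order() yields every ancestor before its descendants,
# matching how the integration task runtime schedules components.
order = list(TopologicalSorter(edges).static_order())
```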

Canvas operations

The pipeline canvas supports the simultaneous construction of multiple pipeline scripts. Additionally, right-clicking on the pipeline canvas allows you to perform various operations.

  • Copy: Copy existing components on the pipeline canvas.

  • Paste: Paste the copied pipeline components onto the pipeline canvas.

  • Delete: Delete the selected components from the canvas.

  • Select All: Select all components on the pipeline canvas.

  • Lasso Select: Drag the mouse to lasso-select multiple components on the canvas.

Switch to code editor mode

For input and output components other than LogicalTable, code editor, and local file, the configuration dialog box supports switching to Code Editor mode. Once switched, the component cannot be reverted to visual configuration mode.

Component configuration instructions

For instructions on configuring the components supported by Dataphin, see the following topics:

Input components

  • MySQL input component

  • Oracle input component

  • Vertica input component

  • FTP input component

  • Hive input component

  • HBase input component

  • LogicalTable input component

  • AnalyticDB for PostgreSQL input component

  • PolarDB input component

  • Local file input component

  • Teradata input component

  • OceanBase input component

  • Hologres input component

  • TDH Inceptor input component

  • DataHub input component

  • DM (Dameng) input component

  • TiDB input component

  • GBase 8a input component

  • SAP Table input component

  • StarRocks input component

  • Elasticsearch input component

  • ArgoDB input component

  • Salesforce input component

  • SelectDB input component

  • Microsoft SQL Server input component

  • PostgreSQL input component

  • PolarDB-X (formerly DRDS) input component

  • HDFS input component

  • MaxCompute input component

  • MongoDB input component

  • AnalyticDB for MySQL 3.0 input component

  • Log Service input component

  • OSS input component

  • SAP HANA input component

  • IBM DB2 input component

  • Code editor input component

  • ClickHouse input component

  • Kafka input component

  • API input component

  • KingbaseES input component

  • GoldenDB input component

  • Impala input component

  • OpenGauss input component

  • Kudu input component

  • Greenplum input component

  • Doris input component

  • Amazon S3 input component

  • Lindorm (compute engine) input component

Output components

  • MySQL output component

  • Oracle output component

  • Vertica output component

  • FTP output component

  • Hive output component

  • HBase output component

  • AnalyticDB for MySQL 2.0 output component

  • AnalyticDB for MySQL 3.0 output component

  • PolarDB output component

  • SAP HANA output component

  • IBM DB2 output component

  • Code editor output component

  • ClickHouse output component

  • Kafka output component

  • KingbaseES output component

  • GoldenDB output component

  • Impala output component

  • StarRocks output component

  • Greenplum output component

  • ArgoDB output component

  • Amazon S3 output component

  • Microsoft SQL Server output component

  • PostgreSQL output component

  • PolarDB-X (formerly DRDS) output component

  • HDFS output component

  • MaxCompute output component

  • MongoDB output component

  • Elasticsearch output component

  • AnalyticDB for PostgreSQL output component

  • OSS output component

  • Teradata output component

  • OceanBase output component

  • Hologres output component

  • TDH Inceptor output component

  • DataHub output component

  • DM (Dameng) output component

  • TiDB output component

  • GBase 8a output component

  • OpenGauss output component

  • API output component

  • Redis output component

  • Doris output component

  • SelectDB output component

  • Lindorm (compute engine) output component

Transform components

  • Field Selection transform component

  • Field Calculation transform component

  • Filter transform component

  • Encryption transform component

  • Decryption transform component

Flow components

  • Throttling flow component

  • Conditional Distribution flow component

Custom components

To use custom components in Dataphin, you must first create them within the platform. Once created, you can select and use them as needed. For detailed instructions, see Create an offline custom source type.