Meta tables are cross-storage type tables managed through data management. You can create and manage input tables, output tables, and dimension tables used in the development process by creating meta tables. This topic describes how to create and manage meta tables.
Benefits
Meta tables provide the following benefits:
Security and reliability: Meta tables effectively prevent sensitive information leakage caused by directly writing native Flink DDL statements.
Improved efficiency and experience: You can reference a table multiple times after creating it once. You do not need to repeatedly write DDL statements or perform complex input, output, or dimension table mapping. This simplifies development and improves efficiency and experience.
Asset lineage: Meta tables maintain upstream and downstream asset lineage information.
Functions of meta tables
With meta tables, you can implement the following scenarios:
Platform-based: Uniformly maintain all real-time meta tables and related schema information.
Asset-based: Uniformly configure and manage tables in the real-time development process.
Meta table page introduction

| Area | Description |
| --- | --- |
| ① Operation bar | Supports save, submit, unpublish, refresh, edit lock, and locate operations. |
| ② Meta table basic information | Basic information of the meta table, including the meta table name, data source type, data source name, source table name, and connector name. |
| ③ Meta table structure operations | Supports searching table fields, adding fields, exporting Flink DDL, sorting, and parsing operations. Multiple methods are supported for adding fields; see Step 2: Add fields. |
| ④ Meta table field list | Displays the meta table fields parsed by the system, including the ordinal number, field name, whether the field is metadata, Flink field type, original field type, and description. Supports edit and delete operations. |
| ⑤ Configure meta table | Supports configuring meta table properties and viewing historical versions of the meta table. |
Procedure
Step 1: Create a meta table
On the Dataphin homepage, choose Development > Data Development from the top navigation bar.
Select a Project from the top navigation bar, and then choose Data Processing > Tables from the left navigation pane.
Click the create table icon in the Tables list to open the Create Table dialog box.
In the Create Table dialog box, configure the following parameters.
| Parameter | Description |
| --- | --- |
| Table Type | Select Meta Table. |
| Meta Table Name | Enter the name of the meta table. The name can contain only letters, digits, and underscores (_), cannot start with a digit, and cannot exceed 64 characters in length. |
| Data Source | Select a data source. For more information about the real-time data sources supported by Dataphin and the types of tables that can be created, see Real-time data sources supported by Dataphin. You can also create custom real-time data source types; for more information, see Create a custom data source type. After you select a data source, configure the corresponding information based on the data source type. For more information, see Appendix: Meta table data source configuration parameters. |
| Source Table | Enter or select a source table. Note: If you set the data source to Hive, you can also select a Hudi or Paimon table; in the source table drop-down list, Hudi tables and Paimon tables are marked with dedicated icons. If the data source is Log Service, DataHub, Kafka, Elasticsearch, Redis, or RabbitMQ, configuring a source table is not supported. |
| Select Directory | The default is Table Management. You can also create a target folder on the Table Management page and select it as the directory for the meta table. To do so, click the create folder icon above the table management list on the left side of the page to open the Create Folder dialog box, enter the folder Name, select a directory location as needed, and click OK. |
| Description | Enter a brief description within 1,000 characters. |
Click OK to complete the creation of the meta table.
Step 2: Add fields
Dataphin meta tables support the following three methods to add fields: SQL import, batch import, and single-line addition.
Note: If the meta table data source is Hive and the source table is a Paimon table, the field list is retrieved from metadata and cannot be edited.
Adding fields by SQL import
On the real-time meta table page, click +Add Field and select SQL Import.
In the SQL Import dialog box, write the SQL code.
Note: Dataphin provides reference examples based on your data source type. You can click the reference example icon in the window to view the corresponding code examples. After completing the code, you can click the format icon to adjust the format of your code with one click. If you select Import Parameter Values From With Parameters, the values in the WITH parameters are imported together.
The code example for a MySQL data source is as follows:

```sql
CREATE TABLE import_table (
    retailer_code INT COMMENT '',
    qty_order VARCHAR COMMENT '',
    cig_bar_code INT COMMENT '',
    org_code INT COMMENT '',
    sale_reg_code INT COMMENT '',
    order_date TIMESTAMP COMMENT '',
    PRIMARY KEY (retailer_code)
) WITH (
    'connector' = 'mysql',
    'url' = 'jdbc',
    'table-name' = 'ads',
    'username' = 'dataphin'
);
```

Click OK to complete the addition of fields.
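For comparison, a SQL import for a Kafka data source might look like the following sketch. The table name, field names, topic, server address, and format values are illustrative assumptions, not values from your environment; the WITH options shown are standard Flink Kafka connector options.

```sql
-- Hypothetical example: importing fields for a Kafka source with a JSON payload.
-- Topic, bootstrap servers, and format are placeholder values.
CREATE TABLE kafka_import_table (
    order_id BIGINT COMMENT '',
    order_amount DECIMAL(16, 2) COMMENT '',
    order_time TIMESTAMP(3) COMMENT ''
) WITH (
    'connector' = 'kafka',
    'topic' = 'orders',
    'properties.bootstrap.servers' = 'localhost:9092',
    'format' = 'json'
);
```

If you select Import Parameter Values From With Parameters, options such as these would be imported into the meta table parameters as well.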
Adding fields by batch import
On the real-time meta table page, click +Add Field and select Batch Import.
In the Batch Import dialog box, write SQL code according to the batch import format.
The batch import format is as follows:

```
Field name||Field type||Description||Is primary key||Is metadata
```

Example:

```
ID||INT||Description||false||false
name||INT||Description||false||false
```
Click OK to complete the addition of fields.
Adding fields by single-line addition
On the real-time meta table page, click +Add Field and select Single-line Addition.
In the Single-line Addition dialog box, configure the parameters.
| Parameter | Description |
| --- | --- |
| Is Metadata | The default is No. If you select Yes, you do not need to specify whether the field is a primary key or the original field type, but you must select the Flink SQL field type. |
| Field Name | Enter a field name. The field name can contain only letters, digits, underscores (_), and periods (.), and cannot start with a digit. |
| Is Primary Key | Select whether the field is a primary key as needed. Note: If your data source is Kafka and the connector is Kafka, select whether the field is a message key. If your data source is HBase, select whether the field is the RowKey. |
| Field Type and Original Field Type | HBase does not have an original field type; select a Flink SQL field type, and if the field is not the RowKey, also fill in the column family. If the Flink SQL field type and the original field type have a many-to-one relationship (for example, Kafka), select the Flink SQL field type; the original field type is mapped from it, is display-only, and cannot be edited. If they have a one-to-many relationship (for example, MySQL, Oracle, PostgreSQL, Microsoft SQL Server, or Hive), first select the original field type; the mapped field type can then be edited, and you can manually add precision. |
Click OK to complete the addition of the field.
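As a sketch of the one-to-many mapping described above: for a MySQL source, a single original type such as DECIMAL or VARCHAR can correspond to several Flink SQL type declarations depending on precision or length, which is why you add the precision manually. The table and field names below are hypothetical, shown as a Flink DDL fragment without connector options.

```sql
-- Hypothetical fields illustrating manually added precision for a MySQL source.
CREATE TABLE mysql_type_example (
    id INT COMMENT '',               -- MySQL INT maps directly to Flink INT
    price DECIMAL(10, 2) COMMENT '', -- MySQL DECIMAL: precision and scale added manually
    title VARCHAR(255) COMMENT ''    -- MySQL VARCHAR: length added manually
);
```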
Step 3: Configure meta table properties
After creating the meta table, click the Properties button on the right to configure the Basic Information, Meta Table Parameters, and Reference sections, and to modify the Debug Test data tables.
Basic Information
| Parameter | Description |
| --- | --- |
| Meta Table Name | The default is the name of the created meta table and cannot be modified. |
| Datasource | The default is the data source type selected when the meta table was created. |
| Data Source Parameters | Different compute engines support different data sources, and different data sources require different configuration parameters. For more information, see Appendix: Meta table data source configuration parameters. |
| Description | Enter a description of the meta table within 1,000 characters. |
Meta Table Parameters
| Parameter | Description |
| --- | --- |
| Parameter Name | Different meta table parameters are provided based on the data source type. You can select from the drop-down list to get the meta table parameters supported by the data source and their descriptions, or fill them in manually. To add a parameter, click Add Parameter. The number of parameters cannot exceed 50. Parameter names can contain only digits, letters, underscores (_), hyphens (-), periods (.), colons (:), and forward slashes (/). |
| Parameter Value | Parameter values provide options based on the parameter type. If there are no options, enter the value manually. Single quotes are not supported. For example: parameter name address, parameter value Ningbo. |
| Actions | You can click the delete icon to delete the corresponding parameter. |
Reference
| Parameter | Description |
| --- | --- |
| Flink Task Name | Displays the names of the Flink tasks that reference this meta table. Note: Draft tasks are not included in the reference information. |
Debug Test
| Parameter | Description |
| --- | --- |
| Default Read During Task Debugging | Set the data table that is read by default during task debugging: either the production table or the development table. If you select the production table, debugging reads data from the corresponding production table, which poses a risk of data leakage; proceed with caution. Reading the production table by default during debugging also requires development and production data source permissions for your personal account; for more information, see Apply for data source permissions. Note: Hive tables and Paimon tables do not support debugging. |
| Read During Development Environment Testing | Set the data table that is read by default during task testing: either the production table or the development table. If you select the production table, testing reads data from the corresponding production table, which poses a risk of data leakage; proceed with caution. Reading the production table by default during testing also requires development and production data source permissions for your personal account; for more information, see Apply for data source permissions. |
| Write During Development Environment Testing | Select either the current source table or another test table. If you select another test table, select the corresponding table. |
Click OK.
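To make the effect of meta table parameters concrete: since the SQL import step can import values from WITH parameters, it is reasonable to think of each configured parameter name and value as a connector WITH option in the effective Flink table definition. The following is a hedged sketch under that assumption; the table name, parameter name `scan.startup.mode`, and its value are illustrative (a real Flink Kafka connector option, but not taken from your configuration).

```sql
-- Sketch: a meta table parameter added in the Properties panel,
-- e.g. name 'scan.startup.mode' with value earliest-offset (entered without
-- single quotes in the UI), conceptually becomes a WITH option like this.
CREATE TABLE kafka_orders (
    order_id BIGINT COMMENT ''
) WITH (
    'connector' = 'kafka',
    'scan.startup.mode' = 'earliest-offset'
);
```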
Step 4: Submit or publish the meta table
Click Submit in the menu bar at the top left of the meta table page.
In the Submit Remarks dialog box, enter your remarks.
Click OK And Submit.

If the project mode is Dev-Prod, you need to publish the meta table to the production environment. For more information, see Manage publish tasks.
Appendix: Meta table data source configuration parameters
| Data Source | Configuration | Description |
| --- | --- | --- |
| MaxCompute | | Source Table: The source table of the data. blinkType: You can select odps or continuous-odps. |
| | Source Table | Source Table: The source table of the data. |
| SAP HANA | | Source Table: The source table of the data. Update time field: Select a field of the timestamp type in the SAP HANA table that indicates the update time, or enter a HANA SQL time string expression. |
| | Source Topic | Source Topic: The source topic of the data. |
| Kafka | | |
| Hudi | | |
| Elasticsearch | | |
| Redis | None | |
| RabbitMQ | | |
| TDH Inceptor | | |
What to do next
After creating a meta table, you can develop real-time tasks based on it. For more information, see: