Dataphin lets you create quality rules to check data tables and improve data quality monitoring. This topic describes how to configure quality rules for tables in Dataphin.
Prerequisites
You must add monitored objects before you can configure quality rules. For more information, see Add Monitored Objects.
Permissions
Super administrators, quality administrators, custom global roles with the Quality Rule - Management permission, and custom project roles with the Project Quality Management - Quality Rule Management permission for the project that contains the table can configure schedules, alerts, exception archive tables, and score weights for quality rules.
Quality owners can configure the same items for quality rules on the monitored objects that they are responsible for.
Quality owners and regular users must also have read permissions for Dataphin tables. To request permissions, see Apply, Renew, and Return Table Permissions.
Operation permissions vary by object. For more information, see Quality Rule Operation Permissions.
Quality rule configuration
Quality rules support Custom Configuration and From Standard options, with the latter requiring the data standard module to be enabled.
Custom Configuration lets you quickly create quality rules using built-in or custom templates. This option also supports custom SQL for flexible monitoring.
From Standard uses quality rules that are configured in the data standards associated with the current asset object. This ensures adherence to standard constraints.
Check rules
When a data table is checked against quality rules, a triggered weak rule sends an alert for you to handle the exception. A triggered strong rule stops the associated task to prevent bad data from flowing to downstream nodes, and also sends an alert for you to handle the exception.
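The alert and blocking behavior described above can be modeled with a minimal sketch. The function name and return shape are hypothetical, not Dataphin's implementation:

```python
# Hypothetical sketch of how rule strength affects a failed quality check.
# handle_failed_check is an illustrative name, not a Dataphin API.

def handle_failed_check(rule_strength: str, has_downstream_tasks: bool) -> dict:
    """Return the actions taken when a quality check fails."""
    actions = {"alert": True, "block_downstream": False}  # any failure alerts
    if rule_strength == "strong" and has_downstream_tasks:
        # Strong rules block downstream tasks to stop bad data from spreading.
        actions["block_downstream"] = True
    return actions

print(handle_failed_check("weak", True))    # alert only
print(handle_failed_check("strong", True))  # alert and block downstream
print(handle_failed_check("strong", False)) # no downstream tasks: alert only
```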
Trial run vs. normal run
A trial run differs from a normal run in its execution method and how results are displayed. A trial run is a simulated execution of a quality rule to check its correctness and performance, and its results are not displayed in the quality report. In contrast, a run executes a quality rule at a specific time, and its results are sent to the quality report for you to view and analyze.
Configure a quality rule
On the Dataphin homepage, click Administration > Data Quality in the top menu bar.
Click Quality Rule in the navigation pane on the left. On the Dataphin Table page, click the name of the target object to open the Quality Rule Details page and configure quality rules.
Custom Configuration
Hover over Create Quality Rule and select Custom Configuration, or click Create Quality Rule directly, to open the Create Quality Rule dialog box.
In the Create Quality Rule dialog box, configure the parameters.
Parameter
Description
Basic information
Rule Name
The name of the custom quality rule. The maximum length is 256 characters.
Rule Strength
Supports Weak Rule and Strong Rule.
Weak Rule: If you select Weak Rule, an alert is triggered when a quality check fails, but it does not block downstream task nodes.
Strong Rule: If you select Strong Rule, an alert is triggered when a quality check fails. If downstream tasks exist, such as from code check scheduling or task trigger scheduling, it blocks them to prevent the diffusion of bad data. If no downstream tasks exist, such as in periodic quality scheduling, it only triggers an alert.
Description
The description of the custom quality rule. The maximum length is 128 characters.
Configuration Method
Template Creation: Use general system templates and custom business templates to quickly create quality rules.
System Template: This template has configurable built-in parameters and is suitable for creating general rules.
Custom Template: This template has preset parameters and requires no configuration. It is typically used to create rules that contain business logic.
SQL: Use SQL to create custom quality monitoring rules for flexible and complex scenarios.
Rule Template
Select a rule template from the drop-down list: Completeness, Uniqueness, Timeliness, Validity, Consistency, Stability, or SQL.
Completeness: Includes Null Value Validation and Empty String Validation.
Uniqueness: Includes Uniqueness Validation, Field Group Count Validation, and Duplicate Value Count Validation.
Timeliness: Includes Time Comparison With Expression, Time Interval Comparison, and Time Interval Comparison In Two Tables.
Validity: Includes Column Format Validation, Column Length Validation, Column Value Domain Validation, Reference Table Validation, and Standard Reference Table Validation (requires activation of the Data Standard module).
Consistency: Includes Columns Value Consistency Validation, Columns Statistical Consistency Validation, Single Field Business Logic Consistency Comparison, Columns In Two Tables Value Consistency Validation, Columns In Two Tables Statistical Consistency Validation, Columns In Two Tables Processing Logic Consistency Validation, and Cross-Source Columns Statistical Consistency Validation.
Stability: Includes Table Stability Validation, Table Volatility Validation, Column Stability Validation, and Column Volatility Validation.
SQL: Includes Custom Statistic Validation and Custom Detail Value Validation.
For more information, see Template Type Description.
Rule Type
The rule type is the most basic property of a template. It is used for description and filtering.
Monitoring Granularity
When the configuration type is Use Custom SQL Statement to Create a Dataset, you can configure the monitoring granularity. You can choose Entire Table or specify a monitoring Field.
Template Configuration
Template Information
When you select a quality rule template, its configuration is displayed. Modify the configuration in Quality Rule Template as needed.
Rule configuration
Rule Configuration
The rule configuration varies depending on the selected rule template. For more information, see Data Table Parameter Configuration.
Special configurations are as follows:
Validation Table Data Filtering: This feature is disabled by default. When enabled, you can configure a filter condition, partition filter, or general data filter for the validation table. The filter condition is directly appended to the validation SQL. If the validation table requires partition filtering, configure the partition filter expression in the schedule configuration. After configuration, the validation partition is used as the minimum granularity for viewing quality reports.
When you select Consistency/Two Tables Field Statistical Value Consistency Comparison or Consistency/Cross-Source Two Tables Field Statistical Value Consistency Comparison as the rule template, you can enable Comparison Table Data Filtering. When enabled, you can configure a filter condition, partition filter, or general data filter for the comparison table. The filter condition is directly appended to the validation SQL.
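As the descriptions above note, filter conditions are appended directly to the validation SQL. The following Python sketch (hypothetical helper, not Dataphin internals) illustrates how a data filter and a partition filter might be combined into a single validation query with AND:

```python
# Illustrative sketch of appending filters to a validation query.
# build_validation_sql and the column/table names are assumptions for illustration.
from typing import Optional

def build_validation_sql(table: str, metric_sql: str,
                         data_filter: Optional[str] = None,
                         partition_filter: Optional[str] = None) -> str:
    # Conditions that are configured are joined with AND and appended as a WHERE clause.
    conditions = [c for c in (partition_filter, data_filter) if c]
    where = f" WHERE {' AND '.join(conditions)}" if conditions else ""
    return f"SELECT {metric_sql} FROM {table}{where}"

sql = build_validation_sql(
    table="dwd_orders",
    metric_sql="COUNT(*) AS total, SUM(CASE WHEN id IS NULL THEN 1 ELSE 0 END) AS nulls",
    data_filter="status = 'paid'",
    partition_filter="ds = '${yyyyMMdd}'",
)
print(sql)
```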
Validation configuration
Rule Validation
After a Data Quality rule validation, the result is compared with the abnormal validation configuration. If the conditions are met, the validation fails and triggers subsequent flows, such as alerts.
The template and configuration determine the metrics available for abnormal validation. You can combine multiple conditions with AND/OR operators, but we recommend using no more than three conditions.
For more information, see Validation Configuration Description.
Archive Configuration
Data Anomaly Archiving
This is Off by default. When enabled, you can archive abnormal data to files or tables. After a quality check, you can download and analyze the archived abnormal data.
Archive Mode supports Archive Only Abnormal Fields and Archive Complete Records.
Archive Only Abnormal Fields: Removes duplicates and archives only the current monitored field. Use this option when a single field is enough to identify the abnormal data.
Archive Complete Records: Archives the entire record that contains the abnormal data. Use this option when the complete record is required to locate the abnormal data.
Note: Because archiving complete records significantly increases the amount of archived data, use the Archive Only Abnormal Fields option in most cases.
Archive Location supports Default File Server and Archive Table For Exception Data. If an exception archive table does not exist, click Manage Exception Archive Table to create one. For more information, see Add an exception archive table.
Default File Server: This is the system file server configured when Dataphin is deployed. You can download abnormal data directly from the Validation Records > Validation Details page or access the default file server to retrieve the data. When you use the default file server, a maximum of 100 abnormal data records are archived for each validation. This option is suitable for scenarios with small data validation volumes.
Archive Table For Exception Data: If you want to store more abnormal data or consolidate abnormal data from different validation records for later comparison and analysis, specify a custom archive table. Each quality rule can record a maximum of 10,000 abnormal data records per run. Besides downloading abnormal data from a single validation on the Validation Records page, you can also directly access the archive table and customize its lifecycle for greater flexibility.
Note: You can download a summary of abnormal data generated by all rules in the current run. The download is limited to 1,000 records. To view more data, archive it to the specified exception archive table and then access the table directly to retrieve the data.
The exception archive table must meet specific format requirements. Otherwise, errors might occur when writing data and affect usage. For more information, see Add an exception archive table.
Business property configuration
Property information
How you set business properties depends on the configuration of the quality rule properties. For example, the property for the managing department is a multi-select enumeration. The available values are Big Data Department, Business Department, and Technical Department. Therefore, when you create a quality rule, this property is a multi-select drop-down list with these options.
The value type for the rule owner property is custom input, with a maximum length of 256 characters. Therefore, when you create a quality rule, the value for this property can be up to 256 characters long.
If a property field is set to Range Interval, configure it as follows:
Range Interval: Use this option for a value range of continuous numbers or dates. You can select from four symbols: >, >=, <, and <=. For more information about property configuration, see Create and manage quality rule properties.
Scheduling property configuration
Scheduling Method
You can select a configured schedule. If you have not decided on a scheduling method, you can configure it after you create the quality rule. To create a new schedule, see Create a Schedule.
Quality Score Configuration
Scoring Method
Two scoring methods are available: quality check status and data pass rate.
Quality check status: The score is based on the status of the most recent successful check for the rule. A passed check receives 100 points. A failed check receives 0 points.
Data pass rate: The score is the percentage of valid data from the most recent successful check. For example, if 80% of the data has a valid format, the quality score is 80.
Different rule templates support different scoring methods. The following rule types support only the Quality check status method:
Uniqueness rules, such as field group count checks and duplicate field value count checks.
Consistency rules, such as single-table field statistic comparisons and cross-source two-table field statistic comparisons.
Stability rules.
Custom SQL rules that check custom statistical metrics.
Quality Score Weight
The weight of the quality rule. This value helps calculate the quality score for the monitored object. Select an integer from 1 to 10.
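The two scoring methods and the score weight can be illustrated with a short sketch. The aggregation shown (a weighted average of rule scores) is an assumption for illustration only; `rule_score` is a hypothetical helper, not a Dataphin API:

```python
# Hedged sketch of the two scoring methods and a weighted aggregate.
# The exact aggregation Dataphin uses is not specified in this topic;
# a weighted average is assumed purely for illustration.

def rule_score(method: str, passed: bool = True, pass_rate: float = 1.0) -> float:
    if method == "check_status":   # passed -> 100 points, failed -> 0 points
        return 100.0 if passed else 0.0
    if method == "pass_rate":      # score equals the percentage of valid data
        return round(pass_rate * 100, 2)
    raise ValueError(f"unknown scoring method: {method}")

# Example: a status-scored rule that failed (weight 3) and a
# pass-rate rule with 80% valid data (weight 7).
scores_weights = [(rule_score("check_status", passed=False), 3),
                  (rule_score("pass_rate", pass_rate=0.8), 7)]
weighted = sum(s * w for s, w in scores_weights) / sum(w for _, w in scores_weights)
print(weighted)  # 56.0
```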
Click OK to save the custom rule configuration.
You can click Preview SQL to review SQL changes by comparing the current configuration with the last saved version.
Note: If key information is missing, Preview SQL is unavailable.
The left pane displays the SQL preview of the last saved configuration. If no configuration was saved, this pane is empty. The right pane displays the SQL preview of the current configuration.
Reference Data Standard Monitoring
Hover over the data table, click Create Quality Rule, and then select From Standard.
In the From Standard dialog box, select the data standard rules that you want to reference. You can filter the standard rules by template, such as Validity, Uniqueness, Completeness, and Stability, or search by object name.
In the dialog box for referencing data standard rules, you can change the rule name and enable or disable the rule. You can also click the standard code under the associated standard to view standard details, or click the view icon in the Actions column to view the quality rule.
Note: After you reference a rule, you cannot change its details. You can configure the schedule and change the rule strength.
Click Add Selected Rules to finish referencing the data standard rules.
Rule configuration list
After a quality rule is created, you can view, edit, trial run, run, and delete it from the rule configuration list.

Region | Description |
① Filter and search area | Supports quick search by object or rule name. Supports filtering by rule type, rule template, rule strength, trial run status, effective status, and rule source. Note: If a quality rule's business attribute is searchable, filterable, and enabled, you can also use it to search and filter. |
② List area | Displays the object type/name, rule name/ID, trial run status, effective status, rule type, rule template, rule strength, schedule type, and related knowledge base document information for the rules in the list. |
③ Operation area | You can view, clone, edit, trial run, run, configure schedules, associate knowledge base documents, configure quality scores, and delete rules. |
④ Batch operation area | You can perform batch operations, such as trial run, run, configure schedule, enable, disable, modify business properties, associate knowledge base documents, configure quality scores, export rules, and delete. |
Create a schedule
When you configure a schedule for a rule, you can quickly configure it based on existing schedules. Each table can have up to 20 scheduling rules.
A single rule can have up to 10 schedules.
If schedule configurations are identical, they are automatically deduplicated.
If the current table is a Hologres partitioned table, we recommend using a Task Triggered schedule.
The check scope is passed as a filter condition to the quality check statement to control the scope of each quality check. The check scope is also the basic unit for downstream components, such as quality reports. Quality reports are viewed at the minimum granularity of the check scope.
On the Quality Rule Details page, click the Scan Configuration tab and then click the Create Schedule button to open the Create Schedule dialog box.
In the Create Schedule dialog box, configure the parameters.
Parameter
Description
Schedule Name
Enter a custom name for the schedule. The name must be 64 characters or less.
Schedule Type
Supports Recurrency Triggered, Data Update Triggered, and Task Triggered.
Recurrency Triggered: Performs scheduled, recurring quality checks on data based on the configured time. This trigger is suitable for scenarios where data generation times are fixed.
Recurrence: The schedule can be set to Day, Week, Month, Hour, or Minute. Running quality rules consumes computing resources, so avoid running multiple quality rules concurrently to prevent them from affecting normal production tasks.
If the system time zone (the time zone in the User Center) does not match the schedule time zone (the time zone configured in Management Hub > System Settings > Basic Settings), the rule runs based on the system time zone.
Fill In Recommended Time: Click Fill In Recommended Time. The system recommends a time based on the average completion time of the tasks that generate the current table.
Data Update Triggered: When any code task runs, the system checks if the task run updates the specified check range of the current table. This trigger is suitable for tables that are modified by various tasks or for tables that require close monitoring, where every change must be checked.
Note: Set the check range to the partitions updated by the task. For non-partitioned tables, the entire table is checked. The system automatically detects and checks all data changes to prevent omissions.
Task Triggered: Runs the configured quality rule after or before a specified task runs successfully. The rule can be triggered by tasks of the following node types: DPI engine SQL, offline pipeline, Python, Shell, Virtual, Datax, Spark_jar, Hive_MR, and database SQL. This trigger is suitable for scenarios where table modification tasks are fixed.
Note: For a task-triggered rule, you can select only tasks in the production environment. If the rule is configured as a strong rule and the check fails, online tasks may be affected. Proceed with caution as needed.
Trigger Timing: Select when to trigger the quality check. You can select Trigger After All Tasks Run Successfully, Trigger After Each Task Runs Successfully, or Trigger Before Each Task Runs.
Triggering Task: Lets project administrators or users with the O&M system role select task nodes in a production project. Search for nodes by output name, or select them from the recommended or all tasks lists.
Recommended Tasks: Displays lineage tasks where the current table serves as the output table and the node output name is in the `section name/project name.table name` format, that is, the task scope shown in Asset Directory > Asset Details > Output Information.
All Tasks: Displays all production tasks for which the current user has O&M permissions.
Note: If you set Trigger Timing to Trigger After All Tasks Run Successfully, select triggering tasks that have the same scheduling cycle. This prevents different scheduling cycles from causing rule execution delays and late quality check results.
If you set Trigger Timing to Trigger Before Each Task Runs, the Recommended Tasks list displays lineage tasks where the current table is an input.
Schedule Condition
This option is disabled by default. When enabled, the system first checks if scheduling conditions are met before the quality rule runs. The rule runs only if the conditions are met. If not, the system ignores the run.
Data Timestamp/Executed On: If the schedule type is Recurrency Triggered (which does not support the execution date), Data Update Triggered, or Task Triggered, you can configure the date. Select Regular Calendar or Custom Calendar. For more information about how to create a custom calendar, see Create a Public Calendar.
If you select Regular Calendar, the available conditions are Month, Week, and Date.
If you select Custom Calendar, the available conditions are Date Type and Tag.
Instance Type: If the schedule type is Data Update Triggered or Task Triggered, you can configure the instance type. Select Recurring Instance, Data Backfill Instance, or One-time Instance.
Note: Configure at least one rule. To add a rule, click the + Add Rule button.
You can configure a maximum of 10 scheduling conditions.
You can set the relationship between scheduling conditions to AND or OR.
Validation range
If the schedule type is Recurrency Triggered or Task Triggered, the Custom Validation Range option is available. If the schedule type is Data Update Triggered, the available options are Updated Partition and Custom Validation Range.
Updated Partition: If a partition is updated in the check node, the node is dispatched based on the updated partition.
Note: In dynamic partition scenarios, partitions may not be parsed, and no quality check is performed.
Volatility check rules, such as checks on partition size, row count, and field statistics, require a partition to be specified. These rules do not support the updated partition validation range.
If data in a non-partitioned table is updated, the entire table is checked.
Custom Validation Range: For scenarios where partitions cannot be parsed, use a custom validation range to specify the validation range expression based on the data timestamp or execution date.
Validation Range Expression: This is a drop-down list that accepts custom input. Enter the range to check, such as ds='${yyyyMMdd}', or select and modify a built-in expression for quick configuration. For more information about partition filter expressions, see Built-in partition filter expression types.
Note: If multiple conditions are required, connect them with and or or. For example: province="Zhejiang" and ds<=${yyyyMMdd}.
If a filter condition is configured in the quality rule, it is combined with the validation range expression using an AND operator. During data validation, both conditions are used for filtering.
The validation range expression supports full table scans. However, a full table scan consumes significant resources, and some checks do not support it. Configure a partition filter expression to avoid a full table scan.
Validation Range Budget: By default, the data timestamp of the current day.
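To illustrate how a placeholder such as ${yyyyMMdd} in a validation range expression could resolve against the data timestamp, here is a hedged sketch. The helper name and the set of supported patterns are assumptions for illustration, not Dataphin's actual expression parser:

```python
# Illustrative substitution of ${...} date placeholders with the data timestamp.
# render_range_expression and the pattern table are assumptions, not Dataphin APIs.
import datetime
import re

def render_range_expression(expr: str, data_date: datetime.date) -> str:
    formats = {"yyyyMMdd": "%Y%m%d", "yyyy-MM-dd": "%Y-%m-%d"}
    def repl(m: re.Match) -> str:
        fmt = formats.get(m.group(1))
        # Leave unknown placeholders untouched rather than guessing.
        return data_date.strftime(fmt) if fmt else m.group(0)
    return re.sub(r"\$\{([^}]+)\}", repl, expr)

expr = "province='Zhejiang' and ds<=${yyyyMMdd}"
print(render_range_expression(expr, datetime.date(2024, 5, 1)))
# province='Zhejiang' and ds<=20240501
```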
Click OK to save the schedule configuration.
Schedule configuration list
After a schedule is created, you can view, edit, clone, and delete it from the schedule configuration list.

Region | Description |
①Filter and search area | Supports quick search by schedule name. Supports filtering by Recurrency Triggered, Data Update Triggered, and Task Triggered. |
②List area | Displays the Schedule Name, Schedule Type, Last Updated By, and Last Updated Time of the rule configurations. |
③Operation area | You can edit, clone, or delete the schedule. |
Alert configuration
You can configure different alert methods for different rules to distinguish between alerts. For example, you can configure phone alerts for strong rule exceptions and text message alerts for weak rule exceptions. If a rule matches multiple alert configurations, you can set a policy to determine which alert takes effect.
A single monitored object can have up to 20 alert configurations.
On the Quality Rule Details page, click the Alert Settings tab and then click the Create Alert Configuration button to open the Create Alert Configuration dialog box.
In the Create Alert Configuration dialog box, configure the parameters.
Parameter
Description
Coverage
Lets you select All Rules, All Strong Rules, All Soft Rules, or Custom.
Note: For a single monitored object, you can configure one alert for each of the following three scopes: All Rules, All Strong Rules, and All Soft Rules. New rules automatically match the corresponding alert based on their strength. To change an alert configuration, modify the existing one.
The Custom scope lets you select up to 200 of the rules configured for the current monitored object.
Alert Configuration Name
For a single monitored object, the alert configuration name must be unique and cannot exceed 256 characters.
Alert Recipient
Configure alert recipients and methods. Select at least one recipient and one method.
Alert Recipient: Choose from custom recipients, a shift schedule, or the quality owner.
You can configure up to 5 custom recipients and 3 shift schedules.
Alert Method: Choose one or more alert methods, such as phone, email, text message, DingTalk, Lark, WeCom, and custom channel. You can manage these methods in Configure Channel Settings.
Click OK to save the alert configuration.
Alert configuration list
After an alert configuration is created, you can sort, edit, and delete it from the alert configuration list.

Region | Description |
① Sort area | Configure the alert policy that applies when a quality rule matches multiple alert configurations. |
② List area | Displays the name of the alert configuration, its effective scope, the specific recipients for each alert type, and the corresponding notification method. Effective Scope: For custom alerts, the scope includes the configured object name and rule name. If a rule is deleted, its object name can no longer be viewed; update the alert configuration promptly. |
③ Operation area | You can edit and delete configured alerts. |
Add an exception archive table
The exception archive table stores records of quality rule check exceptions.
On the Quality Rule Details page, click the Archive tab, and then click the + Add Exception Archive Table button to open the Add Exception Archive Table dialog box.
In the Add Exception Archive Table dialog box, configure the parameters.
The Add Method parameter has two options: Create New Table and Select Existing Table. The archive table does not overwrite the original data table and includes special quality validation fields.
Create New Table: You can customize the table name. The name can contain letters, numbers, underscores (_), and periods (.), and cannot exceed 128 characters. By default, the name is populated with current_table_name_exception_data. The new table must be in the same project or module as the table being archived. After the table is created, it is stored in the same database or data source.
If the monitored table is a physical table, the archive table is created in the same project as the monitored table.
If the monitored table is a logical dimension table or a logical fact table, the archive table is created in the same project as the monitored table by default. You can also manually specify a project under the monitored table's workspace, such as projectA.table_name.
If the monitored table is a logical aggregate table, we recommend specifying a project name under the same workspace for the archive table. Otherwise, it is automatically archived to a project under the monitored table's workspace.
The archive table must contain all the fields from the quality monitoring table and the check fields. The script format is as follows:
create table current_table_name_exception_data (
  dataphin_quality_tenant_id varchar(64) comment 'Tenant ID',
  dataphin_quality_rule_id varchar(64) comment 'Quality Rule ID',
  dataphin_quality_rule_name varchar(256) comment 'Quality Rule Name',
  dataphin_quality_column_name varchar(1024) comment 'Validation Field Name',
  dataphin_quality_watch_task_id varchar(128) comment 'Monitored Object Task ID',
  dataphin_quality_rule_task_id varchar(64) comment 'Rule Task ID',
  dataphin_quality_validate_time varchar(64) comment 'Quality Validation Time',
  dataphin_quality_archive_mode varchar(32) comment 'Abnormal Data Archiving Mode, ONLY_ERROR_FIELD/FULL_RECORD',
  dataphin_quality_error_data string comment 'Abnormal Data',
  ljba_id bigint comment 'ljba Primary Key',
  ljb_id bigint comment 'ljb Primary Key',
  col_tinyint tinyint comment 'Field type is TINYINT and lowercase',
  col_tinyint_02 tinyint comment '2',
  col_smallint smallint comment 'Field type is SMALLINT and lowercase',
  col_smallint_02 smallint comment '4',
  col_int int comment 'Field type is INT and lowercase',
  col_int_02 int comment '6',
  col_bigint bigint comment 'Field type is BIGINT and lowercase',
  col_bigint_02 bigint comment '8',
  col_float float comment 'Field type is FLOAT and lowercase',
  col_float_02 float comment '10',
  col_double double comment 'Field type is DOUBLE and lowercase',
  col_double_02 double comment '11',
  col_decimal decimal(38,18) comment 'Field type is DECIMAL(38,18) and lowercase',
  col_decimal_02 decimal(38,18) comment '12',
  col_varchar varchar(500) comment 'Field type is VARCHAR(500) and lowercase',
  col_varchar_02 varchar(500) comment '13',
  col_char char(10) comment 'Field type is CHAR(10) and lowercase',
  col_char_02 char(10) comment '14',
  col_string string comment 'Field type is STRING and lowercase',
  col_string_02 string comment '15',
  col_date date comment 'Field type is DATE and lowercase',
  col_date_02 date comment '16',
  col_datetime datetime comment 'Field type is DATETIME and lowercase',
  col_datetime_02 datetime comment '17',
  col_timestamp timestamp comment 'Field type is TIMESTAMP and lowercase',
  col_timestamp_02 timestamp comment '18',
  col_boolean boolean comment 'Field type is BOOLEAN and lowercase',
  col_boolean_02 boolean comment '19',
  col_binary binary comment 'Field type is BINARY and lowercase',
  col_binary_02 binary comment '20',
  col_array array<int> comment 'Field type is ARRAY<int> and lowercase',
  col_array_02 array<string> comment '21',
  col_map map<string,string> comment 'Field type is MAP<string, string> and lowercase',
  col_map_02 map<string,int> comment '22',
  ds string comment 'Date partition, yyyyMMdd'
) partitioned by (dataphin_quality_validate_date string comment 'Validation Date (Partition Field)');
Select Existing Table: Choose a table from the same project or data source. The archive table must include all fields and validation fields from the quality monitoring table. You can click View Exception Archive Table DDL to view the table creation statement. The script format is as follows:
create table current_table_name_exception_data (
  dataphin_quality_tenant_id varchar(64) comment 'Tenant ID',
  dataphin_quality_rule_id varchar(64) comment 'Quality Rule ID',
  dataphin_quality_rule_name varchar(256) comment 'Quality Rule Name',
  dataphin_quality_column_name varchar(1024) comment 'Validation Field Name',
  dataphin_quality_watch_task_id varchar(128) comment 'Monitored Object Node ID',
  dataphin_quality_rule_task_id varchar(64) comment 'Rule Node ID',
  dataphin_quality_validate_time varchar(64) comment 'Quality Validation Time',
  dataphin_quality_archive_mode varchar(32) comment 'Abnormal Data Archiving Mode, ONLY_ERROR_FIELD/FULL_RECORD',
  dataphin_quality_error_data string comment 'Abnormal Data',
  ljba_id bigint comment 'ljba Primary Key',
  ljb_id bigint comment 'ljb Primary Key',
  col_tinyint tinyint comment 'Field type is TINYINT and lowercase',
  col_tinyint_02 tinyint comment '2',
  col_smallint smallint comment 'Field type is SMALLINT and lowercase',
  col_smallint_02 smallint comment '4',
  col_int int comment 'Field type is INT and lowercase',
  col_int_02 int comment '6',
  col_bigint bigint comment 'Field type is BIGINT and lowercase',
  col_bigint_02 bigint comment '8',
  col_float float comment 'Field type is FLOAT and lowercase',
  col_float_02 float comment '10',
  col_double double comment 'Field type is DOUBLE and lowercase',
  col_double_02 double comment '11',
  col_decimal decimal(38,18) comment 'Field type is DECIMAL(38,18) and lowercase',
  col_decimal_02 decimal(38,18) comment '12',
  col_varchar varchar(500) comment 'Field type is VARCHAR(500) and lowercase',
  col_varchar_02 varchar(500) comment '13',
  col_char char(10) comment 'Field type is CHAR(10) and lowercase',
  col_char_02 char(10) comment '14',
  col_string string comment 'Field type is STRING and lowercase',
  col_string_02 string comment '15',
  col_date date comment 'Field type is DATE and lowercase',
  col_date_02 date comment '16',
  col_datetime datetime comment 'Field type is DATETIME and lowercase',
  col_datetime_02 datetime comment '17',
  col_timestamp timestamp comment 'Field type is TIMESTAMP and lowercase',
  col_timestamp_02 timestamp comment '18',
  col_boolean boolean comment 'Field type is BOOLEAN and lowercase',
  col_boolean_02 boolean comment '19',
  col_binary binary comment 'Field type is BINARY and lowercase',
  col_binary_02 binary comment '20',
  col_array array<int> comment 'Field type is ARRAY<int> and lowercase',
  col_array_02 array<string> comment '21',
  col_map map<string,string> comment 'Field type is MAP<string, string> and lowercase',
  col_map_02 map<string,int> comment '22',
  ds string comment 'Date partition, yyyyMMdd'
) partitioned by (dataphin_quality_validate_date string comment 'Validation Date (Partition Field)');
Click OK to add the exception archive table.
You can select Automatically Set As Effective Archived Table After Creation to automatically use the new table as the active archive table for new quality rules.
View the list of exception archive tables
After tables are added, the first one becomes the active archive table by default. You can click the name of an exception archive table to view its schema information. You can also set another table as the active archive table or delete tables.
Set As Effective Archive Table: If you set a table as the effective archive table, it becomes the destination for abnormal data from all existing quality rules for this monitored object whose archive location is set to Custom Exception Archive Table.
Delete: This action deletes only the reference to the archive table, not the table itself. You can add the reference again if needed.
View a quality report
Click Quality Report to view the Rule Validation Overview and Rule Validation Details for the current quality rules.
You can quickly filter the check details by exception result, partition time, or rule or object name keyword.
In the Actions column of the rule check details list, you can click the details icon to view the check details of a quality rule, or click the log icon to view its execution log.
Set quality rule permissions
Click Permission Management to configure the View Details permission. This permission allows specified members to view verification record details, quality rule details, and quality reports.
View Details: You can select All Members or Only Members With Current Object Quality Management Permission.
Click OK to save the permission settings.
What to do next
After you configure the quality rules, you can manage and view them on the Dataphin table rule list page. For more information, see View Monitored Object List.
