Dataphin: Create global data table quality rules

Last Updated: Sep 30, 2025

Dataphin lets you create quality rules to validate data tables, which simplifies quality monitoring. This topic describes how to configure quality rules for global data tables.

Prerequisites

You must add monitored objects before you can configure quality rules. For more information, see Add monitored objects.

Permissions

  • Super administrators, quality administrators, and users in custom global roles with Quality Rule-Management permissions can configure settings for quality rules, such as schedules, alerts, exception archive tables, and scoring weights.

  • Quality owners can configure schedules, alerts, exception archive tables, and scoring weights for quality rules for the monitored objects they are responsible for.

  • Quality owners and regular users require read-through permissions for the data source where the global data table is located. For instructions on how to apply, see Apply for data source permissions.

  • The supported operation permissions vary by object. For more information, see Quality rule operation permissions.

Validation rule description

When a data table is validated against a quality rule, the system's response depends on the rule type. If a weak rule is triggered, the system sends an alert message. If a strong rule is triggered, the system automatically stops the task that contains the table to prevent dirty data from flowing downstream and also sends an alert message. In both cases, the alert helps you promptly identify and handle exceptions.

Difference between a trial run and a regular run

A trial run and a regular run differ in their execution method and results. A trial run is a simulated execution of a quality rule used to verify its correctness and performance. The results of a trial run are not included in the quality report. A run executes the quality rule at a scheduled time, and the results are sent to the quality report for you to view and analyze.

Configure a quality rule

  1. From the top menu bar on the Dataphin home page, select Administration > Data Quality.

  2. In the left navigation pane, click Quality Rule. On the Global Data Table page, click the name of the target object to navigate to the Quality Rule Details page, where you can configure the quality rule.

  3. On the Quality Rule Details page, click Create Quality Rule.

  4. In the Create Quality Rule dialog box, configure the parameters described below.

    Parameter

    Description

    Basic information

    Rule Name

    Custom name for the quality rule, up to 256 characters.

    Rule strength

    Supports Weak Rule and Strong Rule.

    • Weak Rule: If you select Weak Rule, an alert is triggered if a quality rule check is abnormal, but it does not block downstream task nodes.

    • Strong Rule: If you select Strong Rule, an alert is triggered if a quality rule check is abnormal. If downstream tasks exist, for example in code check scheduling or task-triggered scheduling, the rule blocks these tasks to prevent contaminated data from spreading. If no downstream tasks exist, for example in periodic quality scheduling, only an alert is triggered.

    Description

    A custom description for the quality rule. The maximum length is 128 characters.

    Configuration method

    • Template Creation: Use general system templates and custom business templates to quickly create quality rules.

      • System Template: This template has configurable built-in parameters for creating general rules.

      • Custom Template: This template has preset parameters that require no configuration. It is typically used to create rules that contain business logic.

    • SQL: Use SQL to customize quality monitoring rules for flexible and complex scenarios.
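    For illustration, a rule created with the SQL configuration method might run a statement similar to the following. This is only a hedged sketch: the table dwd_order_di, the order_id field, and the ds partition are assumed names, and the statistic that the query returns is then compared against the thresholds defined in the validation configuration.

      -- Hypothetical custom statistical indicator check: count duplicate order IDs
      -- in the current day's partition. All object names are illustrative.
      select count(*) - count(distinct order_id) as duplicate_cnt
      from   dwd_order_di
      where  ds = '${yyyyMMdd}';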

    Rule template

    Select a rule template from the drop-down list. The templates include Integrity, Uniqueness, Timeliness, Validity, Consistency, Stability, and Custom SQL.

    • Integrity: Includes Field Null Check and Field Empty String Check (a sketch of a null check appears after this list).

    • Uniqueness: Includes Field Uniqueness Check, Field Group Count Check, and Field Duplicate Count Check.

    • Timeliness: Includes Time Function Comparison, Single Table Time Field Comparison, and Two Table Time Field Comparison.

    • Validity: Includes Field Format Check, Field Length Check, Field Range Check, Lookup Table Reference Comparison, and Data Standard Lookup Table Reference Comparison (requires the Data Standard module).

    • Consistency: Includes Single Table Field Value Consistency Comparison, Single Table Field Statistical Value Consistency Comparison, Single Field Business Logic Consistency Comparison, Two Table Field Value Consistency Comparison, Two Table Field Statistical Value Consistency Comparison, Two Table Field Business Logic Consistency Comparison, and Cross-Source Two Table Field Statistical Value Consistency Comparison.

    • Stability: Includes Table Stability Check, Table Volatility Check, Field Stability Check, and Field Volatility Check.

    • Custom SQL: Includes Custom Statistical Indicator Check and Custom Data Detail Check.

    For more information, see Template type description.
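    For illustration, an Integrity/Field Null Check roughly amounts to comparing the number of null values in the monitored field with the total row count, as in the following sketch. The table dwd_order_di, the buyer_id field, and the ds partition are assumed names; the SQL that Dataphin actually generates depends on the template configuration.

      -- Rough sketch of a field null check: total rows versus rows where the
      -- monitored field is null. All object names are illustrative.
      select count(*)                                          as total_cnt,
             sum(case when buyer_id is null then 1 else 0 end) as null_cnt
      from   dwd_order_di
      where  ds = '${yyyyMMdd}';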

    Rule Type

    The rule type is a basic property of the template and is used to identify and filter rules.

    Monitoring Granularity

    The Custom SQL configuration type lets you set the monitoring granularity to monitor the Full Table or specific Fields.

    Template Configuration

    Template Information

    The configuration for the selected quality rule template is displayed. To modify it, go to Quality Rule Template.

    Rule configuration

    Rule Configuration

    Rule configurations vary based on the selected rule template. For more information, see Data Table Parameter Configuration.

    • Data Filtering For Validation Table: Disabled by default. When enabled, you can configure filter conditions, such as partition filtering or regular data filtering, for the validation table. The filter conditions are directly appended to the validation SQL, as shown in the sketch after this list. If the validation table requires partition filtering, configure the partition filter expression in the schedule configuration. After configuration, the validation partition becomes the minimum granularity for viewing the quality report.

    • If you select the Consistency/Two Table Field Statistical Value Consistency Comparison or Consistency/Cross-Source Two Table Field Statistical Value Consistency Comparison rule template, you can enable Data Filtering For Comparison Table. When enabled, you can configure filter conditions, partition filtering, or regular data filtering for the comparison table. The filter conditions are directly appended to the validation SQL.
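    In both cases, the configured conditions are appended to the generated validation SQL. The following sketch only illustrates the idea: the table and field names are assumptions, not the SQL that Dataphin actually generates.

      -- Illustrative combination of the schedule's partition filter expression,
      -- the data filter configured on the rule, and the check itself.
      select count(*) as abnormal_cnt
      from   dwd_order_di
      where  ds = '${yyyyMMdd}'        -- partition filter expression (schedule configuration)
      and    order_status = 'PAID'     -- Data Filtering For Validation Table condition
      and    order_amount < 0;         -- the check defined by the rule itself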

    Validation configuration

    Rule validation

    • After a Data Quality rule is checked, the result is compared with the abnormal check configuration. If the conditions are met, the check fails. This also triggers alerts and other subsequent flows.

    • The available metrics for abnormal checks are determined by the template and configuration. These checks support AND/OR logic for multiple conditions. Use fewer than three conditions in a single configuration.

    For more information, see Validation Configuration.

    Archive configuration

    Exception archive

    This feature is disabled by default. When enabled, it archives abnormal data to a file or table. After a quality check, download and analyze the archived abnormal data.

    • Archive Mode supports Archive Only Abnormal Fields and Archive Complete Records.

      • Archive Only Abnormal Fields: Removes duplicates and archives only the data from the monitored field. Use this mode when a single field is enough to identify the abnormal data.

      • Archive Complete Records: Archives the entire record that contains the abnormal data. Use this mode when the complete record is required to locate the abnormal data. Note: Archiving complete records significantly increases the amount of archived data. Archive only abnormal fields in most cases.

    • Archive Location supports Default File Server and Archive Table For Exception Data. If an exception archive table has not been created, click Manage Exception Archive Table to create one. For more information, see Add an exception archive table.

      • Default File Server: This is the system file server configured during Dataphin deployment. Download the abnormal data directly from the Validation Records > Validation Details page, or access the default file server to retrieve the data. When you use the default file server, a maximum of 100 abnormal data records are archived for each validation. This option is suitable for scenarios that involve validating small amounts of data.

      • Archive Table For Exception Data: If you want to store more abnormal data or consolidate abnormal data from different validation records for later analysis, specify a custom archive table. Each quality rule can record a maximum of 10,000 abnormal data records per run. In addition to downloading the abnormal data from a single validation on the Validation Records page, you also have the flexibility to access the archive table directly and customize its lifecycle.

        Note
        • A consolidated report of abnormal data from all rules in the current run is available for download. The download is limited to 1,000 records. To view more data, archive it to the specified exception archive table and then query the table directly (see the sketch after this note).

        • The exception archive table must meet specific format requirements. Otherwise, errors may occur when writing data, which can affect its use. For more information, see Add an exception archive table.
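        A custom exception archive table can be queried directly with SQL. The following is a minimal sketch that assumes the default table name current_table_name_exception_data and the field layout shown in Add an exception archive table below; the rule name and validation date values are placeholders.

          -- Pull the abnormal data that one rule archived on a given validation date.
          -- Table name, rule name, and date are placeholders for illustration.
          select dataphin_quality_rule_name,
                 dataphin_quality_column_name,
                 dataphin_quality_archive_mode,
                 dataphin_quality_error_data
          from   current_table_name_exception_data
          where  dataphin_quality_validate_date = '20250930'
          and    dataphin_quality_rule_name     = 'your_rule_name';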

    Business property configuration

    Property information

    The entry requirements for business properties depend on the configuration of quality rule properties. For example, the value type for the department in charge field is a multi-select enumeration. The available values are Big Data Department, Business Department, and Technical Department. Therefore, when you create a quality rule, this property appears as a multi-select drop-down list with these options.

    The value type for the rule owner field is custom input, with a maximum length of 256 characters. Therefore, when you create a quality rule, you can enter up to 256 characters for this property.

    If the input method for a property field is Range Interval, use it for a continuous range of numbers or dates. You can choose from four comparison operators: >, >=, <, and <=. For more information about property configurations, see Create and manage quality rule properties.

    Scheduling property configuration

    Scheduling Method

    Select a configured schedule. If you have not decided on a schedule, you can configure one after you create the quality rule. To create a new schedule, see Create Schedule.

    Quality Score Configuration

    Scoring Method

    Two scoring methods are supported: quality validation status and data compliance ratio.

    • Quality Validation Status: The score is based on the result of the most recent successful validation for a rule. A passed validation scores 100 points, and a failed validation scores 0 points.

    • Data Compliance Ratio: The score is the percentage of normal data (the normal rate) from the most recent successful validation. For example, if the data format validity is 80%, the quality score is 80 points.

    Different rule templates support different scoring methods. The following rule types support only the Quality Validation Status scoring method:

    • Field Group Count Check and Field Duplicate Count Check in the Uniqueness rule category.

    • Single Table Field Statistical Value Consistency Comparison and Cross-Source Two Table Field Statistical Value Consistency Comparison in the Consistency rule category.

    • Stability rule category.

    • Custom Statistical Indicator Check in the Custom SQL rule category.

    Quality Score Weight

    The weight of the quality rule contributes to the quality score of the monitored object. The value must be an integer from 1 to 10.
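    For example, assuming the monitored object's score is the weight-weighted average of its rule scores (an assumption for illustration only), a rule that scored 100 points with a weight of 10 and a rule that scored 80 points with a weight of 5 would yield (100 × 10 + 80 × 5) / (10 + 5) ≈ 93.3 points for the object.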

  5. Click Confirm to save the rule configuration.

    You can click Preview SQL to compare the current configuration with the last saved version and identify any SQL changes.

    Note
    • The Preview SQL feature is unavailable if key information is missing.

    • The left side shows the SQL preview of the last saved configuration. If no configuration was saved, this area is empty. The right side shows the SQL preview of the current configuration.

Rule configuration list

On the rule configuration list page, you can view information about configured data table rules. You can also perform operations such as viewing, editing, trial running, running, and deleting them.

Region

Description

Filter and Search Area

Quickly search by object or rule name.

Filter by rule type, rule template, rule strength, trial run status, and effective status.

Note

If a business property is configured as searchable and filterable in the quality rule property settings and is enabled, you can search or filter by that property.

List area

This list shows the object type/name, rule name/ID, trial run status, effective status, rule type, rule template, rule strength, scheduling type, and related knowledge base document information for each rule. Click the icon to select the fields to display in the list.

  • Effective Status: Perform a trial run on a rule before you enable it. To prevent incorrect rules from blocking online tasks, enable the rule only after it passes the trial run.

    • After you enable a rule, it automatically runs on its configured schedule.

    • After you disable a rule, it no longer runs automatically, but you can run it manually.

  • Related Knowledge Base Document: Click View Details to view the knowledge base information associated with the rule. This information includes the table name, validation object, rule, and related knowledge base documents. You can also search for, view, edit, and delete items in the knowledge base. For more information, see View Knowledge Base.

Operation area

You can perform the following operations:

  • View: View rule configuration details.

  • Clone: Quickly clone rules.

  • Edit: After you edit a rule, perform a trial run again. For rules that reference a data table, you can only modify the rule name and strength.

  • Trial Run: Select an Existing Schedule or a Custom Validation Range to perform a trial run of the rule. After the trial run, click the icon to View Trial Run Log.

  • Run: Select an Existing Schedule or a Custom Validation Range to run the rule. After the run, view the validation results in Quality Record.

  • Scan Configuration: In the dialog box, filter schedules by type or use a schedule name for a quick search. You can also edit schedules.

  • Associate Knowledge Base Document: After you associate a rule with a knowledge base document, you can view the document in the Quality Rules and Administration workbenches. You can select a knowledge base that is not already associated. To create a knowledge base, see Create and manage a knowledge base.

  • Quality Score Configuration: You can modify the scoring method and quality score weight of the quality rule.

    Important

    The quality score weight of a quality rule is used to calculate the quality score of the monitored object. Modifying this weight affects the quality score results. Proceed with caution.

  • Delete: Deleting this quality rule object deletes all quality rules under it. This action cannot be undone. Proceed with caution.

Batch Operations Area

You can perform the following operations in batches: trial run, execute, configure schedule, enable, disable, modify business properties, associate knowledge base documents, configure quality scores, export rules, and delete.

  • Trial Run: Select Existing Schedule or Custom Validation Range to perform a trial run of rules in a batch. After the trial run, click the icon to View Trial Run Log.

  • Run: Select Existing Schedule or Custom Validation Range to execute rules in a batch. After the execution, view the validation results in Quality Record.

    Note

    For batch executions, select tables that use the same partition. The system directly uses the partition information for execution. Mismatched partitions may cause errors.

  • Configure Schedule: In the dialog box, filter schedules by type or use a schedule name to perform a quick search. Edit schedules to configure them for quality rules in a batch. You can modify only the selected rules that are editable on the quality rules list page.

  • Enable: After you enable the selected rules in a batch, the rules automatically run based on their configured schedules. You can enable only the selected rules that are editable on the quality rules list page.

  • Disable: After you disable the selected rules in a batch, the rules do not run automatically, but you can run them manually. You can disable only the selected rules that are editable on the quality rules list page.

  • Modify Business Properties: Modify business properties in a batch if the value type of the corresponding field is single-select or multi-select.

    • If the value type is multi-select, append or modify property values.

    • If the value type is single-select, directly modify property values.

  • Associate Knowledge Base Document: After you associate a rule with a knowledge base, view the associated knowledge in the quality rule and administration workbench. Configure knowledge bases for monitored objects in a batch. To create a knowledge base, see Create and manage a knowledge base.

  • Configure Quality Score: Modify the scoring method and quality score weight of quality rules in a batch.

    Important

    The quality score weight of a quality rule is used to calculate the quality score of the monitored object. Changes to the weight affect the quality score results. Proceed with caution.

  • Export Rules: Export selected custom SQL quality rules from the current monitored object. This operation requires view permissions for the selected rules.

  • Delete: Delete quality rule objects in a batch. This action cannot be undone. Proceed with caution. You can delete only the rules for which you have edit permissions.

Create a schedule

Note
  • When you configure a schedule for a rule, you can base it on an existing schedule. Each table can have a maximum of 20 schedule configurations.

  • A single rule can have a maximum of 10 schedules.

  • If schedule configurations are identical, they are automatically deduplicated.

  • The validation range is passed as a filter condition to the quality validation statement to control the scope of each validation. This range also serves as the basic unit for downstream components, such as quality reports, where it is the minimum viewing granularity.

  1. On the Quality Rule Details page, on the Scan Configuration tab, click the Create Scheduling button to open the Create Scheduling dialog box.

  2. In the Create Scheduling dialog box, configure the parameters.

    Parameter

    Description

    Schedule Name

    Enter a name for the schedule. The name can be up to 64 characters long.

    Scheduling Type

    Supports Recurrency Triggered and Task Triggered.

    • Recurrency Triggered: Runs quality checks on data at scheduled times or intervals. This method is ideal for scenarios where data generation times are consistent.

      Recurrence: Running quality rules consumes computing resources. Avoid running multiple quality rules concurrently to prevent impacts on your production tasks. The recurrence types are Day, Week, Month, Hour, and Minute.

      If the system time zone (in the User Center) differs from the scheduling time zone (configured in Management Hub > System Settings > Basic Settings), the rules run based on the system time zone.

    • Task Triggered: Runs the configured quality rule before or after a specified task runs successfully. The rule can be triggered by task nodes of types such as DPI engine SQL, offline pipeline, Python, Shell, Virtual, Datax, Spark_jar, Hive_MR, and database SQL. This method is ideal when the tasks that update the table are fixed.

      Note

      Task triggers can only be set for tasks in the production environment. If you configure a strong rule and the scheduled task validation fails, your online tasks may be affected. Proceed with caution.

      • Trigger Timing: Select when to trigger the quality check. The options are Trigger After All Tasks Run Successfully, Trigger After Each Task Runs Successfully, and Trigger Before Each Task Runs.

      • Triggering Task: Project administrators or users with the O&M system role can select a task node in the production project. Search for the node by its output name, or select it from the list of recommended or all tasks.

        Note

        If you set Trigger Timing to Trigger After All Tasks Run Successfully, select triggering tasks that have the same scheduling cycle. This prevents rule execution delays and ensures that quality check results are generated promptly.

    Schedule Condition

    This is disabled by default. If you enable this feature, the system checks if the scheduling conditions are met before the quality rule is scheduled. The rule is scheduled only if the conditions are met. If the conditions are not met, the scheduled run is skipped.

    • Data Timestamp/Executed On: If you set Scheduling Type to Recurrency Triggered (the Executed On option is not supported) or Task Triggered, you can configure the date by selecting Regular Calendar or Custom Calendar. For more information about how to create a custom calendar, see Create a public calendar.

      • If you select Regular Calendar, the available conditions are Month, Week, and Date.

      • If you select Custom Calendar, the available conditions are Date Type and Tag.

    • Instance Type: If you set Scheduling Type to Task Triggered, you can configure the instance type by selecting Recurring Instance, Data Backfill Instance, or One-time Instance.

    Note
    • Configure at least one rule. To add a rule, click the + Add Rule button.

    • You can configure a maximum of 10 scheduling conditions.

    • The relationship between scheduling conditions can be set to AND or OR.

    Validation Range Expression

    This is an editable drop-down list. Enter the range to check, such as ds='${yyyyMMdd}'. To speed up configuration, select and modify a built-in partition filter expression. For more information about partition filter expressions, see Built-in partition filter expression types.

    Note
    • If multiple conditions are required, use and or or to connect them. For example, province = 'Zhejiang' and ds <= '${yyyyMMdd}'.

    • If a filter condition is configured in the quality rule, the partition filter expression and the filter condition are combined with an AND operator. When data is checked, both conditions are used for filtering (see the sketch below).

    • The partition filter expression also supports full table scans. However, a full table scan consumes significant resources and is not supported in all cases. Configure a partition filter expression whenever possible to avoid a full table scan.
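    As a sketch of how the expression takes effect, suppose the schedule's validation range expression is province = 'Zhejiang' and ds <= '${yyyyMMdd}' and the rule has the filter condition order_status = 'PAID'; the table dwd_order_di and all field names are assumptions. On the run whose data timestamp is 20250930, the generated filtering would look roughly like this:

      -- Illustrative combination only; the SQL that Dataphin actually generates may differ.
      select count(*) as total_cnt
      from   dwd_order_di
      where  (province = 'Zhejiang' and ds <= '20250930')  -- validation range expression, placeholder expanded
      and    (order_status = 'PAID');                      -- filter condition configured on the quality rule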

    Validation Range Budget

    The default is the data timestamp for the current day.

  3. Click Confirm to save the scheduling configuration.

Scheduling configuration list

After a schedule is created, you can view, edit, clone, and delete it in the scheduling configuration list.

Area

Description

Filter and search area

Supports quick search by schedule name.

Supports filtering by Recurrency Triggered and Task Triggered.

List area

Displays the Schedule Name, Scheduling Type, Last Updated By, and Last Updated Time of each schedule.

Operation area

You can edit, clone, and delete schedules.

  • Edit: Modify the configured schedule information.

    Important

    All rule configurations that reference this schedule are updated automatically. Proceed with caution.

  • Clone: Quickly copy a schedule configuration.

  • Delete: A schedule cannot be deleted if it is referenced by a rule configuration.

Alert configuration

You can configure different alert methods for different rules to customize alerts. For example, you can configure phone alerts for strong rule exceptions and text message alerts for weak rule exceptions. If a rule matches multiple alert configurations, you can set an alert policy to determine which configuration to apply.

Note

A single monitored object can have a maximum of 20 alert configurations.

  1. On the Quality Rule Details page, on the Alert Configuration tab, click the Create Alert Configuration button to open the Create Alert Configuration dialog box.

  2. In the Create Alert Configuration dialog box, set the parameters.

    Parameter

    Description

    Coverage

    Lets you select All Rules, All Strong Rules, All Weak Rules, or Custom.

    Note
    • For a single monitored object, you can configure one alert for each of the following scopes: All Rules, All Strong Rules, and All Weak Rules. New rules are automatically matched to the corresponding alert based on their strength. To change an alert configuration, modify the existing one.

    • The Custom scope lets you select up to 200 configured rules for the current monitored object.

    Alert Configuration Name

    The alert configuration name under a single monitored object must be unique and not exceed 256 characters.

    Alert Recipients

    Configure alert recipients and alert methods. You must select at least one alert recipient and one alert method.

    • Alert Recipients: You can select custom recipients, shift schedules, or quality owners.

      You can configure up to 5 custom alert recipients and 3 shift schedules.

    • Alert Method: You can select an alert method, such as phone, email, text message, DingTalk, Lark, WeCom, or custom channels. You can manage these methods in Configure Channel Settings.

  3. Click Confirm to complete the alert configuration.

Alert configuration list

After the alert configuration is complete, you can sort, edit, and delete configurations in the alert configuration list.

Number

Description

① Sorting area

Configure the alert policy that applies when a quality rule matches multiple alert configurations:

  • First Matched Alert Configuration Takes Effect: When this policy is selected, only the first alert configuration that a rule matches takes effect. All other configurations are ignored. You can sort the configurations to set their priority: click Rule Sort, then drag the icon next to a configuration name or use the icons in the Operations column to move it to the top or bottom of the list. After sorting, click Sort Complete.

  • All Alert Configurations Take Effect: All alert configurations in the list apply to the quality rules for the current monitored object. For example, if you configure multiple alert configurations and select this option, the system merges alerts based on the notification method, recipient, and alert rule. Specifically, if the recipient is the same person and the notification methods are Custom and Quality Owner, the system merges the alert messages based on the merge policy.

    Note

    Shift schedules do not support alert merging.

② List area

Displays the name, effective range, specific recipients for each alert type, and the corresponding alert receiving method of the alert configuration.

Effective Range: For custom alerts, you can view the configured object name and rule name. If a rule has been deleted, its object name is no longer displayed. Update the alert configuration accordingly.

③ Operation area

You can edit or delete configured alerts.

  • Edit: Modifies the configured alert information. If you change the alert recipient or method, notify the relevant personnel promptly to avoid missing important business alerts.

  • Delete: After an alert configuration is deleted, the rules it covers no longer trigger alerts. Proceed with caution.

Add an exception archive table

The exception archive table stores records of quality rule validation exceptions.

  1. On the Quality Rule Details page, on the Archive tab, click the +Add Exception Archive Table button to open the Add Exception Archive Table dialog box.

  2. In the Add Exception Archive Table dialog box, set the parameters.

    The Addition Method parameter has two options: Create New Table and Select Existing Table. The archive table includes the quality validation fields, which prevents abnormal data from being written to the source data table.

    • Create New Table: Creates a table with a custom name in the project or section of the monitored table. The default name is current_table_name_exception_data, and the table is created in the same database or data source as the monitored table. The name can include letters, digits, underscores (_), and periods (.), and must not exceed 128 characters.

      • If the monitored table is a physical table, the archive table is created in the project where the monitored table is located.

      • If the monitored table is a logical dimension table or a logical fact table, the archive table is created by default in the project where the monitored table is located. You can also manually specify a project under the section of the monitored table, such as projectA.table_name.

      • If the monitored table is a logical aggregate table, specify a project name for the archive table under the same section. Otherwise, the table is automatically archived to a project under the section where the monitored table is located.

      • The archive table must contain all fields from the monitored table and the validation fields. The script format is as follows:

        create table current_table_name_exception_data
         (dataphin_quality_tenant_id      varchar(64)   comment 'Tenant ID' , 
          dataphin_quality_rule_id        varchar(64)   comment 'Quality Rule ID', 
          dataphin_quality_rule_name      varchar(256)  comment 'Quality Rule Name', 
          dataphin_quality_column_name    varchar(1024) comment 'Validation Field Name', 
          dataphin_quality_watch_task_id  varchar(128)  comment 'Monitored Object Task ID', 
          dataphin_quality_rule_task_id   varchar(64)   comment 'Rule Task ID', 
          dataphin_quality_validate_time  varchar(64)   comment 'Quality Validation Time', 
          dataphin_quality_archive_mode   varchar(32)   comment 'Exception Archive Mode, ONLY_ERROR_FIELD/FULL_RECORD', 
          dataphin_quality_error_data     string        comment 'Exception Data', 
          ljba_id                         bigint        comment  'ljba Primary Key', 
          ljb_id                          bigint        comment  'ljb Primary Key', 
          col_tinyint                     tinyint       comment 'Field type is TINYINT and lowercase',
          col_tinyint_02                  tinyint       comment '2',
          col_smallint                    smallint      comment 'Field type is SMALLINT and lowercase',
          col_smallint_02                 smallint      comment '4',
          col_int                         int           comment 'Field type is INT and lowercase',
          col_int_02                      int           comment '6',
          col_bigint                      bigint        comment 'Field type is BIGINT and lowercase',
          col_bigint_02                   bigint        comment '8',
          col_float                       float         comment 'Field type is FLOAT and lowercase',
          col_float_02                    float         comment '10',
          col_double                      double        comment 'Field type is DOUBLE and lowercase',
          col_double_02                   double        comment '11',
          col_decimal                     decimal(38,18) comment 'Field type is DECIMAL(38,18) and lowercase',
          col_decimal_02                  decimal(38,18) comment '12',
          col_varchar                     varchar(500)   comment 'Field type is VARCHAR(500) and lowercase',
          col_varchar_02                  varchar(500)   comment '13',
          col_char                        char(10)       comment 'Field type is CHAR(10) and lowercase',
          col_char_02                     char(10)       comment '14',
          col_string                      string         comment 'Field type is STRING and lowercase',
          col_string_02                   string         comment '15',
          col_date                        date           comment 'Field type is DATE and lowercase',
          col_date_02                     date           comment '16',
          col_datetime                    datetime       comment 'Field type is DATETIME and lowercase',
          col_datetime_02                 datetime       comment '17',
          col_timestamp                   timestamp      comment 'Field type is TIMESTAMP and lowercase',
          col_timestamp_02                timestamp      comment '18',
          col_boolean                     boolean        comment 'Field type is BOOLEAN and lowercase',
          col_boolean_02                  boolean        comment '19',
          col_binary                      binary         comment 'Field type is BINARY and lowercase',
          col_binary_02                   binary         comment '20',
          col_array                       array<int>     comment 'Field type is ARRAY<int> and lowercase',
          col_array_02                    array<string>  comment '21',
          col_map                         map<string,string>  comment 'Field type is MAP<string, string> and lowercase',
          col_map_02                      map<string,int>     comment '22',
          ds                              string              comment 'Date partition, yyyyMMdd'
         ) 
        partitioned by 
        (dataphin_quality_validate_date string comment 'Validation Date (Partition Field)');
    • Select Existing Table: You can select a table from the same project or data source. The archive table must include all fields of the monitored table as well as the validation fields. You can click View Exception Archive Table DDL to see the table creation statement. The script is formatted as follows:

      create table current_table_name_exception_data
       (dataphin_quality_tenant_id      varchar(64)   comment 'Tenant ID' , 
        dataphin_quality_rule_id        varchar(64)   comment 'Quality Rule ID', 
        dataphin_quality_rule_name      varchar(256)  comment 'Quality Rule Name', 
        dataphin_quality_column_name    varchar(1024) comment 'Validation Field Name', 
        dataphin_quality_watch_task_id  varchar(128)  comment 'Monitored Object Task ID', 
        dataphin_quality_rule_task_id   varchar(64)   comment 'Rule Task ID', 
        dataphin_quality_validate_time  varchar(64)   comment 'Quality Validation Time', 
        dataphin_quality_archive_mode   varchar(32)   comment 'Exception Archive Mode, ONLY_ERROR_FIELD/FULL_RECORD', 
        dataphin_quality_error_data     string        comment 'Exception Data', 
        ljba_id                         bigint        comment  'ljba Primary Key', 
        ljb_id                          bigint        comment  'ljb Primary Key', 
        col_tinyint                     tinyint       comment 'Field type is TINYINT and lowercase',
        col_tinyint_02                  tinyint       comment '2',
        col_smallint                    smallint      comment 'Field type is SMALLINT and lowercase',
        col_smallint_02                 smallint      comment '4',
        col_int                         int           comment 'Field type is INT and lowercase',
        col_int_02                      int           comment '6',
        col_bigint                      bigint        comment 'Field type is BIGINT and lowercase',
        col_bigint_02                   bigint        comment '8',
        col_float                       float         comment 'Field type is FLOAT and lowercase',
        col_float_02                    float         comment '10',
        col_double                      double        comment 'Field type is DOUBLE and lowercase',
        col_double_02                   double        comment '11',
        col_decimal                     decimal(38,18) comment 'Field type is DECIMAL(38,18) and lowercase',
        col_decimal_02                  decimal(38,18) comment '12',
        col_varchar                     varchar(500)   comment 'Field type is VARCHAR(500) and lowercase',
        col_varchar_02                  varchar(500)   comment '13',
        col_char                        char(10)       comment 'Field type is CHAR(10) and lowercase',
        col_char_02                     char(10)       comment '14',
        col_string                      string         comment 'Field type is STRING and lowercase',
        col_string_02                   string         comment '15',
        col_date                        date           comment 'Field type is DATE and lowercase',
        col_date_02                     date           comment '16',
        col_datetime                    datetime       comment 'Field type is DATETIME and lowercase',
        col_datetime_02                 datetime       comment '17',
        col_timestamp                   timestamp      comment 'Field type is TIMESTAMP and lowercase',
        col_timestamp_02                timestamp      comment '18',
        col_boolean                     boolean        comment 'Field type is BOOLEAN and lowercase',
        col_boolean_02                  boolean        comment '19',
        col_binary                      binary         comment 'Field type is BINARY and lowercase',
        col_binary_02                   binary         comment '20',
        col_array                       array<int>     comment 'Field type is ARRAY<int> and lowercase',
        col_array_02                    array<string>  comment '21',
        col_map                         map<string,string>  comment 'Field type is MAP<string, string> and lowercase',
        col_map_02                      map<string,int>     comment '22',
        ds                              string              comment 'Date partition, yyyyMMdd'
       ) 
      partitioned by 
      (dataphin_quality_validate_date string comment 'Validation Date (partition field)');
  3. Click Confirm to add the exception archive table.

    If you select Automatically Set As Effective Archive Table After Creation, the archive table is automatically selected when you create quality rules.

View the exception archive table list

The first exception archive table that you create becomes the default effective archive table. You can click the name of an exception archive table to view its schema. You can also set other archive tables as the effective archive table or delete them.

  • Set As Effective Archive Table: If this table is set as the effective archive table, exception data is archived to this table from all quality rules for the monitored object whose archive location is set to a custom exception archive table.

  • Delete: This action only removes the reference to the exception archive table without deleting the table itself. You can re-establish the reference if necessary.

View the quality report

Click Quality Report to view the Rule Validation Overview and Rule Validation Details of the current quality rule.

  • You can filter validation details by exception result, partition time, or a keyword in the rule or object name.

  • In the Actions column of the rule validation details list, click the icon to view the validation details of the quality rule.

  • In the Actions column of the rule validation details list, click the icon to view the execution log of the quality rule.

Configure quality rule permissions

  1. Click Permission Management and configure the View Details permission to specify which members can view validation record details, quality rule details, and quality reports.

    For View Details, you can select either All Members or Only Members With Quality Management Permissions For The Current Object.

  2. Click Confirm to finalize the configuration for permission management.

What to do next

After you configure the quality rule, you can view and manage it from the monitored object list. For more information, see View monitored object list.