
PolarDB: ALTER TABLE ... ENABLE/DISABLE CONSTRAINT

Last Updated: Sep 17, 2025

ENABLE/DISABLE CONSTRAINT in PolarDB for PostgreSQL (Compatible with Oracle) provides a flexible way to manage table constraints without dropping their definitions. By temporarily disabling constraint checks, you can safely and efficiently perform resource-intensive operations such as data import, data repair, and constraint deployment.

Overview

When you add or manage constraints on large tables, operations like validation and indexing often require full table scans and long-lasting locks, which can affect business performance and availability. ENABLE/DISABLE CONSTRAINT optimizes database performance by modifying the active state of CHECK, FOREIGN KEY, PRIMARY KEY, and UNIQUE constraints.

Two primary commands, DISABLE CONSTRAINT and ENABLE CONSTRAINT, are available:

  • DISABLE CONSTRAINT: Disables a constraint. New writes (INSERT and UPDATE) are no longer validated against it.

  • ENABLE CONSTRAINT: Enables a constraint with the following two validation modes:

    • ENABLE CONSTRAINT (Default): Validates new data and scans all existing data. For large tables, this process can be time-consuming and consume significant I/O resources.

    • ENABLE CONSTRAINT ... NOT VALID: Validates only new data, skipping the check on existing data for a fast operation. Use this mode when you are confident that existing data is already valid.
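
For example, the following is a minimal sketch of the typical flow. The table name t and the constraint name chk_t are hypothetical placeholders; the exact syntax and lock behavior are described later in this topic.

-- Stop checking new writes while you load or repair data
ALTER TABLE t DISABLE CONSTRAINT chk_t;

-- ... perform the bulk load or data repair ...

-- Option 1: Re-enable and validate new data plus all existing data (full table scan)
ALTER TABLE t ENABLE CONSTRAINT chk_t;

-- Option 2: Re-enable and validate only new data, skipping existing rows
ALTER TABLE t ENABLE CONSTRAINT chk_t NOT VALID;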

Prerequisites

Before you begin, ensure that your cluster runs PolarDB for PostgreSQL (Compatible with Oracle) with a revision version of 2.0.11.9.11.0 or later.

Note

You can check your version in the console or by running SHOW polardb_version;. If needed, upgrade the minor engine version.
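
For example, run the following statement from any SQL client connected to the cluster:

SHOW polardb_version;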

Dependencies

  • Extensions: This feature depends on the polar_constraint extension. You must install and configure it as described in the Install the extension section of this topic.

  • Constraints:

    • Disabling referenced constraints:

      You cannot disable a PRIMARY KEY or UNIQUE constraint if it is being referenced by an active FOREIGN KEY. The referencing foreign keys must be disabled first.

    • Enabling foreign keys:

      You can enable a FOREIGN KEY only if the referenced PRIMARY KEY or UNIQUE constraint is enabled. A sketch of the resulting order of operations follows this list.

  • Compatibility:

    • This is a PolarDB-specific feature. Native PostgreSQL does not support disabling PRIMARY KEY and UNIQUE constraints.

    • The syntax is based on PostgreSQL and differs from Oracle syntax such as ENABLE VALIDATE and ENABLE NOVALIDATE.
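
The following is a minimal sketch of the order of operations implied by the constraint dependencies above. The table and constraint names (a customers table with primary key pk_customers, referenced by the fk_orders_customers foreign key on orders) are hypothetical:

-- 1. Disable the referencing foreign key first.
ALTER TABLE orders DISABLE CONSTRAINT fk_orders_customers;

-- 2. The referenced primary key can then be disabled.
ALTER TABLE customers DISABLE CONSTRAINT pk_customers;

-- Re-enable in the reverse order: the referenced constraint first, then the foreign key.
ALTER TABLE customers ENABLE CONSTRAINT pk_customers;
ALTER TABLE orders ENABLE CONSTRAINT fk_orders_customers;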

Install the extension

Before you use this feature, you must install the polar_constraint extension for your cluster.

  1. Run the following command to check if the extension is installed:

    SELECT extname, extversion FROM pg_extension WHERE extname = 'polar_constraint';

    If the extension and its version information are returned, the extension is installed.

  2. (Optional) If the extension is not installed, follow these steps:

    1. (Optional) For clusters with a revision version earlier than 2.0.11.9.25.0: Go to the Configuration and Management > Parameter Configuration page in the PolarDB console and modify the shared_preload_libraries cluster parameter by adding polar_constraint.

      Note
      • If the parameter already contains other extensions, use a comma to separate them. For example: pg_stat_statements,polar_constraint.

      • This action restarts your cluster and will cause a service interruption. Perform this operation during off-peak hours or within a maintenance window.

    2. Install the extension in your database.

      -- Install the extension
      CREATE EXTENSION IF NOT EXISTS polar_constraint;
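
      To confirm that the shared_preload_libraries change from the previous step has taken effect, you can check the current value. For example:

      SHOW shared_preload_libraries;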

Syntax

Add a constraint

  • Disabled state: When you add a constraint in the disabled state, the database scans all existing data to ensure that it meets the constraint rule.

    ALTER TABLE table_name ADD CONSTRAINT constraint_name DISABLE;
    Note

    If you add a CHECK (column_name > 0) constraint to a table and the table contains data that violates this constraint, the operation fails with an error.

  • Enabled state

    • When you add a constraint, the database scans all existing data in the table to ensure that it complies with the constraint rule.

      ALTER TABLE table_name ADD CONSTRAINT constraint_name;
      Note

      If you add a CHECK (column_name > 0) constraint to a table containing data that violates this constraint, the operation fails and an error is returned.

    • When you add the constraint, the database validates only new data and skips the check on existing data.

      ALTER TABLE table_name ADD CONSTRAINT constraint_name NOT VALID;
      Note

      If you add a CHECK (column_name > 0) constraint to a table and the table contains data that violates this constraint, the operation is successful. You can scan and validate the existing data later during off-peak hours.
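
Putting the three ADD CONSTRAINT forms together: the following is a minimal sketch that assumes a hypothetical table t with a numeric column amount. The three statements are alternatives, not a sequence, and each must include the constraint definition itself (for example, CHECK (amount > 0)). The position of the DISABLE keyword after the definition is an assumption based on the abbreviated syntax above.

-- Add and enable the constraint; all existing data is validated (default)
ALTER TABLE t ADD CONSTRAINT chk_amount CHECK (amount > 0);

-- Add and enable the constraint; only new data is validated
ALTER TABLE t ADD CONSTRAINT chk_amount CHECK (amount > 0) NOT VALID;

-- Add the constraint in the disabled state; existing data is still scanned
-- (keyword placement assumed; see the abbreviated syntax above)
ALTER TABLE t ADD CONSTRAINT chk_amount CHECK (amount > 0) DISABLE;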

Enable a constraint

  • (Default) When you enable the constraint, the database validates new data and scans all existing data in the table to ensure that it meets the constraint rule.

    ALTER TABLE table_name ENABLE CONSTRAINT constraint_name;
  • When you enable the constraint, the database validates only new data and skips the check on existing data.

    ALTER TABLE table_name ENABLE CONSTRAINT constraint_name NOT VALID;

Disable a constraint

ALTER TABLE table_name DISABLE CONSTRAINT constraint_name;

Drop a constraint

ALTER TABLE table_name DROP CONSTRAINT constraint_name;

Best practices: Choosing the right operations

Each command locks your database table in a different way, which affects how it performs. To use this feature safely, it's important to understand these differences.

  • ADD CONSTRAINT ... (adds an enabled constraint and immediately validates existing data)

    • Lock level: ACCESS EXCLUSIVE
    • Read impact: High (blocks all read operations)
    • Write impact: High (blocks all write operations)
    • DDL impact: High (blocks other DDL operations)
    • Full table scan: Yes
    • Risk: Running this on a large table locks the entire table for a long time, causing a service interruption.
    • Recommendation: Do not use this operation directly on large tables in a production environment. Use the two-phase method instead.

  • ADD CONSTRAINT ... NOT VALID (adds an enabled constraint but does not validate existing data)

    • Lock level: ACCESS EXCLUSIVE (brief)
    • Read impact: Low (briefly blocks only during metadata modification)
    • Write impact: Low (briefly blocks only during metadata modification)
    • DDL impact: Low (briefly blocks)
    • Full table scan: No
    • Risk: Existing data may not meet the constraint.
    • Recommendation: Recommended as the first step for adding a constraint to a large table. The operation is fast and has almost no impact on your business. Scan and validate the existing data later during off-peak hours.

  • ADD CONSTRAINT ... DISABLE (adds a disabled constraint and immediately validates existing data)

    • Lock level: SHARE UPDATE EXCLUSIVE
    • Read impact: None (does not block SELECT)
    • Write impact: High (blocks INSERT/UPDATE/DELETE)
    • DDL impact: High (blocks other DDL operations)
    • Full table scan: Yes
    • Risks:
      • High operational overhead: The operation performs a full table scan and blocks write operations for a long time. This impacts your business.
      • Misleading behavior: The operation validates historical data, but the constraint is not enabled. New data written to the table is not checked. This can immediately create dirty data.
    • Recommendation: This is a highly specific operation. Use it to perform a one-time check to see if table data meets a rule, without immediately enforcing that rule. Because this operation blocks write operations, run it during off-peak hours.

  • ENABLE CONSTRAINT (enables a constraint and validates existing data)

    • Lock level: SHARE UPDATE EXCLUSIVE
    • Read impact: None (does not block SELECT)
    • Write impact: High (blocks INSERT/UPDATE/DELETE)
    • DDL impact: High (blocks other DDL operations)
    • Full table scan: Yes
    • Risk: The operation fails if non-compliant existing data exists. The full table scan consumes resources and blocks writes.
    • Recommendation: Run this operation during off-peak hours. Manually check the data before you enable the constraint.

  • ENABLE CONSTRAINT ... NOT VALID (enables a constraint but does not validate existing data)

    • Lock level: ACCESS EXCLUSIVE (brief)
    • Read impact: Low
    • Write impact: Low
    • DDL impact: Low
    • Full table scan: No
    • Risk: Inconsistencies in existing data are ignored. Only new and modified data is validated.
    • Recommendation: Suitable for scenarios where you want to immediately enforce the constraint on new data but cannot or do not need to clean up historical data at the moment. This is equivalent to ENABLE NOVALIDATE in Oracle.

  • DISABLE CONSTRAINT (disables a constraint)

    • Lock level: ACCESS EXCLUSIVE (brief)
    • Read impact: Low
    • Write impact: Low
    • DDL impact: Low
    • Full table scan: No
    • Risk: Data integrity risk. Non-compliant dirty data can be written while the constraint is disabled.
    • Recommendation: Use only for controlled, temporary scenarios, such as batch data loading. Re-enable the constraint as soon as possible after the operation.

  • DROP CONSTRAINT (drops a constraint)

    • Lock level: ACCESS EXCLUSIVE (brief)
    • Read impact: Low
    • Write impact: Low
    • DDL impact: Low
    • Full table scan: No
    • Risk: This is a permanent operation. The constraint definition is lost. Dropping a PRIMARY KEY or UNIQUE constraint also drops its index.
    • Recommendation: Use with caution. Make sure the business rule is no longer needed. To temporarily disable checks, use DISABLE instead.

Use case: Safely add constraints to large tables

Directly adding a constraint to a large table using ADD CONSTRAINT triggers an immediate full table scan and holds an ACCESS EXCLUSIVE lock. This process blocks all read and write operations for a long time, which can severely impact services in a production environment. To minimize the impact, use the following two-phase method:

Example use case

Add a CHECK constraint to a products table with hundreds of millions of rows to ensure that the price column is greater than 0.

Procedure overview

  1. Add an enabled constraint without validating existing data: Quickly add the constraint definition without validating existing data. The lock duration is short.

  2. Validate existing data in the background: During off-peak hours, scan and validate the existing data to ensure that the columns in the table meet the constraint. This process allows concurrent reads and has minimal impact on your services.

Procedure

  1. Quickly add an enabled constraint without validating existing data. To do this, use the NOT VALID option in your ALTER TABLE command. This approach only modifies metadata, making the operation very fast.

    -- Create a test table and data
    CREATE TABLE products (
        id SERIAL PRIMARY KEY,
        name TEXT,
        price NUMERIC
    );
    
    -- Insert a non-compliant row
    INSERT INTO products (name, price) VALUES ('Bag', -1); 
    
    -- Insert a compliant row
    INSERT INTO products (name, price) VALUES ('Book', 10); 
    
    -- Add an enabled constraint without validating existing data
    ALTER TABLE products ADD CONSTRAINT chk_price_positive CHECK (price > 0) NOT VALID;
  2. (Optional) Verify that the constraint is enforced on new data.

    -- New illegal data is blocked
    INSERT INTO products (name, price) VALUES ('Pen', -5);
    -- Error:
    -- ERROR: new row for relation "products" violates check constraint "chk_price_positive"
    -- DETAIL: Failing row contains (3, Pen, -5).
    
    -- Updated illegal data is blocked
    UPDATE products SET price = -2 WHERE id = 1;
    -- Error:
    -- ERROR:  new row for relation "products" violates check constraint "chk_price_positive"
    -- DETAIL:  Failing row contains (1, Bag, -2).
  3. Validate that the historical data meets the constraint. Use SELECT to find non-compliant data and UPDATE to fix it.

    -- Find non-compliant historical data
    SELECT * FROM products WHERE NOT (price > 0);
    
    -- Fix non-compliant historical data
    UPDATE products SET price = 10 WHERE id = 1;
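  4. (Optional) Confirm that the repair succeeded. The following check is a minimal sketch based on the example data above; after the fix in the previous step, the count should be 0.

    -- Count the historical rows that still violate the rule
    SELECT count(*) AS violating_rows FROM products WHERE NOT (price > 0);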

Use case: Optimize bulk data imports

When importing large datasets with commands like COPY, enabled constraints can create a performance bottleneck by validating every row. To dramatically increase import speed, use the following workflow:

  1. Disable the table's constraints before the import.

  2. Perform the bulk data import.

  3. Re-enable the constraints after the import is finished.

Example use case

Import a large CSV file into target_table.

Procedure

  1. Temporarily disable all constraints before the data import.

    -- Assume that target_table already has unique key, foreign key, and check constraints
    ALTER TABLE target_table DISABLE CONSTRAINT uq_target;
    ALTER TABLE target_table DISABLE CONSTRAINT fk_target;
    ALTER TABLE target_table DISABLE CONSTRAINT chk_target;
    Note

    Disabling a primary key is generally not recommended, and it is not possible while active foreign keys reference it. Focus on disabling UNIQUE, FOREIGN KEY, and CHECK constraints.

  2. Perform the data import.

    With constraint checks disabled, the import will be significantly faster.

    COPY target_table FROM '/path/to/data.csv' WITH CSV;
  3. Re-enable the constraints and validate all data.

    Choose one of the following methods based on how confident you are in the quality of the imported data.

    • Option A: Enable without validation (Fastest)
      Use this if you are confident the imported data is clean or if you plan to validate it later.

      Note

      ENABLE CONSTRAINT ... NOT VALID skips validation of the newly imported data and only enforces the constraint on future changes.

      -- Enable the unique constraint
      ALTER TABLE target_table ENABLE CONSTRAINT uq_target NOT VALID;
      
      -- Enable the foreign key constraint
      ALTER TABLE target_table ENABLE CONSTRAINT fk_target NOT VALID;
      
      -- Enable the CHECK constraint
      ALTER TABLE target_table ENABLE CONSTRAINT chk_target NOT VALID;
    • Option B: Enable with validation (Slower, Safer)
      Use this to ensure all data (including the imported data) is valid. Run this during off-peak hours, as it performs a full table scan and can block writes.

      -- Enable the unique constraint. This triggers a validation of all data in the table.
      ALTER TABLE target_table ENABLE CONSTRAINT uq_target;
      -- If the validation fails, an error is returned: ERROR: could not enable unique constraint "uq_target"
      
      -- Enable the foreign key constraint. This triggers a validation of all data in the table.
      ALTER TABLE target_table ENABLE CONSTRAINT fk_target;
      -- If the validation fails, an error is returned: ERROR: insert or update on table "target_table" violates foreign key constraint "fk_target"
      
      -- Enable the CHECK constraint. This triggers a validation of all data in the table.
      ALTER TABLE target_table ENABLE CONSTRAINT chk_target;
      -- If the validation fails, an error is returned: ERROR: check constraint "chk_target" is violated by some row

FAQ

When I run ENABLE CONSTRAINT, the error ERROR: check constraint "..." is violated by some row is returned. What should I do?

This error means existing data in the table violates the constraint. You must find and fix this data before you can enable the constraint.

Example

For the constraint CHECK (price > 0):

  1. Find the non-compliant data.

     SELECT * FROM your_table WHERE NOT (price > 0);
  2. Fix or delete the data.

    UPDATE your_table SET price = ... WHERE ...;
    DELETE FROM your_table WHERE NOT (price > 0);
  3. Retry enabling the constraint.

    ALTER TABLE your_table ENABLE CONSTRAINT your_constraint_name;

When I enable a unique constraint, the error ERROR: could not enable unique constraint "..." is returned. Why?

This error indicates that there are duplicate values in the columns that are used to enforce uniqueness.

Example

  1. Find the duplicate values.

    SELECT column_list, COUNT(*) FROM your_table GROUP BY column_list HAVING COUNT(*) > 1;
  2. Handle the duplicate data as needed. For example, delete the duplicate rows and keep only one.

    DELETE FROM your_table WHERE id NOT IN (SELECT MIN(id) FROM your_table GROUP BY column_list);
  3. Try to enable the constraint again.

    ALTER TABLE your_table ENABLE CONSTRAINT your_unique_name;