
AnalyticDB: Manage built-in datasets

Last Updated: Mar 28, 2026

AnalyticDB for MySQL includes a built-in TPC-H dataset (10 GB) that you can load to test cluster performance without preparing your own data. You can load a built-in dataset automatically when you create a cluster or manually from the console. This topic explains how to load and delete the dataset.

Prerequisites

Before you begin, make sure that:

  • Your cluster is Enterprise Edition, Basic Edition, or Data Lakehouse Edition

  • The user_default resource group has at least 16 ACU (AnalyticDB Compute Unit) of reserved compute resources — loading uses this resource group for data initialization

  • The cluster has at least 24 ACU of reserved storage resources

  • No database named ADB_Internal_TPCH_10GB already exists in the cluster

The built-in dataset is approximately 10 GB. Storage for this dataset is free of charge.

Permissions for standard database accounts

Standard database accounts do not have permissions on the ADB_Internal_TPCH_10GB database and cannot load, use, or delete the dataset. A privileged account must grant access first:

GRANT SELECT ON ADB_Internal_TPCH_10GB.* TO <user_name>;

Replace <user_name> with the standard account name.
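After granting access, the privileged account can confirm that the grant took effect. The sketch below uses `standard_user` as a placeholder account name; SHOW GRANTS is standard MySQL syntax, and its availability in AnalyticDB for MySQL is an assumption here.

```sql
-- As the privileged account: confirm the grant (SHOW GRANTS is standard
-- MySQL syntax; support in AnalyticDB for MySQL is an assumption)
SHOW GRANTS FOR standard_user;

-- As the standard account: a read-only sanity check (the region table name
-- assumes the built-in tables follow the standard TPC-H schema)
SELECT * FROM ADB_Internal_TPCH_10GB.region;
```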

Automatically load the built-in dataset

When you create an AnalyticDB for MySQL cluster, you can enable Load Built-in Dataset to automatically load the built-in dataset after the cluster is created. For more information, see Create a cluster.

Manually load the built-in dataset

If you did not enable Load Built-in Dataset when you created the cluster, you can follow these steps to load it manually. Loading takes approximately 6–8 minutes.

  1. Log on to the AnalyticDB for MySQL console. In the upper-left corner, select a region. In the left-side navigation pane, click Clusters. Find your cluster and click the cluster ID.

  2. In the left-side navigation pane, choose Job Development > SQL Development.

  3. Click Load Built-in Dataset.

After loading completes, the Load Built-in Dataset button is disabled. The ADB_Internal_TPCH_10GB database and its tables appear on the Databases and Tables tab.

For details about the tables in the built-in dataset, see Create test tables.
Important

Run only SELECT queries against ADB_Internal_TPCH_10GB. If DDL (Data Definition Language) or DML (Data Manipulation Language) operations corrupt the dataset, delete the database and reload the dataset.

Run sample queries

After the dataset is loaded, run the sample TPC-H query scripts to test your cluster. Go to Job Development > SQL Development and open the Scripts tab.

For the full list of query statements, see TPC-H test set.
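For example, TPC-H query 1 (the pricing summary report) is a pure aggregation over the lineitem table and is a common first performance check. The version below is the standard TPC-H Q1 and assumes the built-in tables follow the standard TPC-H schema and column names.

```sql
-- TPC-H Q1: pricing summary report (standard benchmark query;
-- assumes the built-in dataset uses the standard TPC-H schema)
SELECT
    l_returnflag,
    l_linestatus,
    SUM(l_quantity)                                       AS sum_qty,
    SUM(l_extendedprice)                                  AS sum_base_price,
    SUM(l_extendedprice * (1 - l_discount))               AS sum_disc_price,
    SUM(l_extendedprice * (1 - l_discount) * (1 + l_tax)) AS sum_charge,
    AVG(l_quantity)                                       AS avg_qty,
    AVG(l_extendedprice)                                  AS avg_price,
    AVG(l_discount)                                       AS avg_disc,
    COUNT(*)                                              AS count_order
FROM ADB_Internal_TPCH_10GB.lineitem
WHERE l_shipdate <= DATE '1998-12-01' - INTERVAL 90 DAY
GROUP BY l_returnflag, l_linestatus
ORDER BY l_returnflag, l_linestatus;
```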

Note

The built-in dataset is approximately 10 GB, whereas the tables created in the TPC-H test are approximately 1 TB. Test results reflect cluster performance for a 10 GB data volume, not the full TPC-H scale.

Delete the built-in dataset

Deleting individual tables may break associated scripts. To fully remove the dataset, delete all tables first, then drop the database.

  1. Drop each table individually:

    DROP TABLE table_name;

    Repeat for every table in ADB_Internal_TPCH_10GB.

  2. Drop the database:

    DROP DATABASE ADB_Internal_TPCH_10GB;
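Assuming the built-in tables use the eight standard TPC-H table names, the full cleanup sequence might look like the sketch below; run SHOW TABLES first to confirm the actual table list in your cluster.

```sql
USE ADB_Internal_TPCH_10GB;
SHOW TABLES;  -- confirm the actual table list before dropping

-- The eight standard TPC-H table names (an assumption about the built-in dataset)
DROP TABLE customer;
DROP TABLE lineitem;
DROP TABLE nation;
DROP TABLE orders;
DROP TABLE part;
DROP TABLE partsupp;
DROP TABLE region;
DROP TABLE supplier;

-- Drop the database only after all tables are gone
DROP DATABASE ADB_Internal_TPCH_10GB;
```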

When the database is deleted, all associated scripts are also removed.

FAQ

What are the minimum resource requirements to load the dataset?

The user_default resource group needs at least 16 ACU of reserved compute resources, and the cluster needs at least 24 ACU of reserved storage resources. If either requirement is not met, loading fails.

How do I confirm that loading succeeded?

Go to Job Development > SQL Development. Loading is complete when the Load Built-in Dataset button is disabled, a success icon appears next to it, and the ADB_Internal_TPCH_10GB database appears on the Databases and Tables tab.
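You can also check from any SQL client connected to the cluster; the statement below is standard MySQL syntax, and its behavior in AnalyticDB for MySQL is an assumption here.

```sql
-- Returns one row when the dataset database exists, and an empty
-- result set otherwise
SHOW DATABASES LIKE 'ADB_Internal_TPCH_10GB';
```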

What should I do if loading fails or takes too long?

Drop all tables in ADB_Internal_TPCH_10GB using DROP TABLE table_name;, then drop the database with DROP DATABASE ADB_Internal_TPCH_10GB;. After the database is fully removed, reload the dataset.
