
Hologres: Guide to Using Serverless Instances

Last Updated: Feb 04, 2026

This topic describes how to purchase and use Hologres Serverless instances.

Limits

  • Hologres Serverless instances are available in the following regions and zones:

    China (Hangzhou): Zone J, K
    China (Shanghai): Zone E, L
    China (Shenzhen): Zone F, D
    China (Beijing): Zone I, L
    China (Hong Kong): Zone B, D
    Singapore: Zone A, C

  • Each Alibaba Cloud account can create only one Serverless instance per region.

Features

Serverless instances provide the following features, which differ from those of compute group and general-purpose instances:

  • The maximum available compute resources (quota) for each Serverless instance is 512 CU. You can still set resource limits at the SQL level. For more information, see Serverless Computing.

  • All read and write requests in a Serverless instance use Serverless compute resources. You cannot manually set the hg_computing_resource parameter.

  • Serverless instances do not support Query Queue. All requests follow the resource request rules and queuing rules of Serverless Computing.

  • A Serverless instance supports up to 256 connections. For more information, see Connection Management.

  • The default number of shards for a Serverless instance is 16. The total number of shards across all table groups in an instance cannot exceed 128. Avoid setting a high shard count. For more information, see Shard Management.
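The resource-limit and shard settings above can be sketched in SQL as follows. This is illustrative only: 64 is an arbitrary per-session value below the 512 CU ceiling, and the shard query uses the `hologres.hg_table_group_properties` system view described in Shard Management.

```sql
-- Cap serverless compute resources for the current session
-- (64 is an illustrative value; the instance-wide quota is 512 CU):
SET hg_experimental_serverless_computing_required_cores = 64;

-- Inspect the shard count of each table group in the current database:
SELECT tablegroup_name, property_value AS shard_count
FROM hologres.hg_table_group_properties
WHERE property_key = 'shard_count';
```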

Serverless instances do not support the following features: Fixed Plan, Query Queue, Binlog consumption, Infrequent Access storage, and vector computing.

The following describes the integration of Serverless instances with other services:

  • Flink: Serverless instances cannot be used as Flink source tables. They can serve as Flink dimension tables and sink tables. However, because Fixed Plan is not supported, queries automatically use the HQE execution engine, which may reduce performance. We recommend that you perform thorough testing before you use this feature.

  • DataWorks Data Integration: Because Fixed Plan is not supported, DataWorks data import jobs automatically use the HQE execution engine, which may reduce performance. We recommend that you perform thorough testing before you use this feature.

  • PAI-Rec: Because vector computing and Fixed Plan are not supported, you must use a compute group instance with the PAI-Rec recommendation system.
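The Flink sink-table path above can be sketched with the Hologres connector for Flink SQL. The endpoint, database, table, and AccessKey pair are placeholders for your own values, and the schema is illustrative:

```sql
-- Flink SQL: declare a Hologres table as a sink (illustrative placeholders)
CREATE TABLE holo_sink (
    id BIGINT,
    repo_name STRING,
    created_at TIMESTAMP
) WITH (
    'connector' = 'hologres',
    'endpoint'  = '<instance-endpoint>:80',  -- VPC endpoint of the Serverless instance
    'dbname'    = '<database>',
    'tablename' = '<schema>.<table>',
    'username'  = '<AccessKey ID>',
    'password'  = '<AccessKey Secret>'
);
```

Because Fixed Plan is unavailable, writes through such a sink fall back to the HQE execution engine, which is why thorough testing is recommended before production use.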

Create a Serverless Instance

  1. Go to the purchase page. Alternatively, log on to the Hologres console. In the navigation pane on the left, click Instances, and then click Create Instance.

  2. On the purchase page, configure the following parameters:

    Product Type: Dedicated Instance (Pay-as-you-go)

    Region: Your target region (must be a region that supports Serverless instances).

    Instance Type: Serverless

    Zone: Your target zone.

    VPC and vSwitch: Select your VPC and vSwitch. For more information, see VPC and vSwitch.

    Instance Name: Enter a custom name.

    Service-linked Role: Created. If you are purchasing a Hologres instance for the first time, click Create Service-linked Role at the bottom of the purchase page.

    Resource Group: Default Resource Group. You can also select another resource group.

  3. After you complete the configuration, click Buy Now.

  4. On the Confirm Order page, click Activate Now.

    The Serverless instance is created within a few minutes.

Serverless Instance O&M

Auto Shutdown

If a Serverless instance receives no data writes or queries for 30 consecutive days, it is automatically shut down on the 31st day. To check the remaining time before the instance is automatically shut down, perform the following steps:

  1. Log on to the Hologres console.

  2. In the navigation pane on the left, click Instances.

  3. Click the target Serverless instance. On the Instance Details page, view the information in the Basic Information section.

When an instance is shut down, its stored data is retained and continues to incur pay-as-you-go storage charges. Compute services are paused, and no charges are incurred for them. To resume compute services, go to the Instances page and click Restore in the Status column of the instance.

Instance Version Upgrade

To improve stability and compatibility with Serverless Computing resource pools, Hologres may automatically upgrade your Serverless instance to the latest stable version during your maintenance window. In addition to automatic upgrades, you can manually upgrade to the latest stable version.

All Serverless instance upgrades are hot upgrades. For more information about the impact, see Upgrade Methods.

Monitoring Metrics

Serverless instances support most monitoring metrics. For more information, see Monitoring Metrics in the Hologres Console. However, metrics for unsupported features, such as Fixed Plan, Infrequent Access storage, and Binlog consumption, are unavailable.

You can view monitoring metrics for Serverless instances and configure monitoring alerts as needed.

Use a Serverless Instance

The procedure for using a Serverless instance is the same as that for a compute group or general-purpose instance. You can connect to the instance and run queries as you normally would.
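Because Hologres is compatible with the PostgreSQL protocol, a standard PostgreSQL client can connect. A minimal sketch with psql, where the endpoint, port, database, and AccessKey pair are placeholders taken from your instance's details page:

```shell
# Connect with psql; the endpoint and port come from the Instance Details page.
PGUSER=<AccessKey-ID> PGPASSWORD=<AccessKey-Secret> \
  psql -h <instance-endpoint> -p <port> -d <database>
```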

The following example uses public GitHub event data to demonstrate how to use a Serverless instance. Perform the following steps:

  1. Create a Serverless instance.

  2. Create a database.

  3. Import a public dataset with one click or manually create an SQL query to create a MaxCompute foreign table and import data. Run the following statements:

    -- Create a schema for the foreign table
    CREATE SCHEMA IF NOT EXISTS hologres_foreign_dataset_github_event;
    
    -- Create a schema for the internal table and import data
    CREATE SCHEMA IF NOT EXISTS hologres_dataset_github_event;
    
    -- Create a foreign table
    DROP FOREIGN TABLE IF EXISTS hologres_foreign_dataset_github_event.dwd_github_events_odps;
    
    IMPORT FOREIGN SCHEMA "bigdata_public_dataset#github_events" LIMIT TO
    (
        dwd_github_events_odps
    ) 
    FROM SERVER odps_server INTO hologres_foreign_dataset_github_event OPTIONS(if_table_exist 'error',if_unsupported_type 'error');
    
    -- Create an internal table
    DROP TABLE IF EXISTS hologres_dataset_github_event.hologres_github_event;
    BEGIN;
    CREATE TABLE hologres_dataset_github_event.hologres_github_event (
        id BIGINT,
        actor_id BIGINT,
        actor_login TEXT,
        repo_id BIGINT,
        repo_name TEXT,
        org_id BIGINT,
        org_login TEXT,
        type TEXT,
        created_at TIMESTAMP WITH TIME ZONE NOT NULL,
        action TEXT,
        iss_or_pr_id BIGINT,
        number BIGINT,
        comment_id BIGINT,
        commit_id TEXT,
        member_id BIGINT,
        rev_or_push_or_rel_id BIGINT,
        ref TEXT,
        ref_type TEXT,
        state TEXT,
        author_association TEXT,
        language TEXT,
        merged boolean,
        merged_at TIMESTAMP WITH TIME ZONE,
        additions BIGINT,
        deletions BIGINT,
        changed_files BIGINT,
        push_size BIGINT,
        push_distinct_size BIGINT,
        hr TEXT,
        month TEXT,
        year TEXT,
        ds TEXT
    );
    CALL set_table_property('hologres_dataset_github_event.hologres_github_event', 'distribution_key', 'id');
    CALL set_table_property('hologres_dataset_github_event.hologres_github_event', 'event_time_column', 'created_at');
    CALL set_table_property('hologres_dataset_github_event.hologres_github_event', 'clustering_key', 'created_at');
    
    
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.id IS 'Event ID';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.actor_id IS 'Actor ID';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.actor_login IS 'Actor login name';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.repo_id IS 'Repository ID';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.repo_name IS 'Repository name';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.org_id IS 'Organization ID';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.org_login IS 'Organization name';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.type IS 'Event type';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.created_at IS 'Event occurrence time';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.action IS 'Event behavior';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.iss_or_pr_id IS 'Issue or pull request ID';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.number IS 'Issue or pull request number';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.comment_id IS 'Comment ID';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.commit_id IS 'Commit ID';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.member_id IS 'Member ID';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.rev_or_push_or_rel_id IS 'Review, push, or release ID';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.ref IS 'Name of created or deleted resource';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.ref_type IS 'Type of created or deleted resource';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.state IS 'State of issue, pull request, or pull request review';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.author_association IS 'Relationship between actor and repository';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.language IS 'Programming language';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.merged IS 'Whether the merge was accepted';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.merged_at IS 'Code merge time';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.additions IS 'Lines of code added';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.deletions IS 'Lines of code deleted';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.changed_files IS 'Number of files changed in the pull request';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.push_size IS 'Number of commits';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.push_distinct_size IS 'Number of distinct commits';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.hr IS 'Hour of event occurrence (e.g., hr=00 for 00:23)';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.month IS 'Month of event occurrence (e.g., month=2015-10 for October 2015)';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.year IS 'Year of event occurrence (e.g., year=2015 for 2015)';
    COMMENT ON COLUMN hologres_dataset_github_event.hologres_github_event.ds IS 'Date of event occurrence (ds=yyyy-mm-dd)';
    COMMIT;
    
    -- Import data
    SET hg_experimental_serverless_computing_required_cores = 192;
    INSERT INTO hologres_dataset_github_event.hologres_github_event
    SELECT
        *
    FROM
        hologres_foreign_dataset_github_event.dwd_github_events_odps
    WHERE
        ds BETWEEN (CURRENT_DATE - interval '365 day')::text AND (CURRENT_DATE - interval '1 day')::text;
    RESET hg_experimental_serverless_computing_required_cores;
    
    -- Update table statistics
    ANALYZE hologres_dataset_github_event.hologres_github_event;
  4. Query the data.

    For example, to query the most active repositories by event count in the last week, run the following statement:

    SELECT
        repo_name,
        COUNT(*) AS events
    FROM
        hologres_dataset_github_event.hologres_github_event
    WHERE
        ds >= (CURRENT_DATE - INTERVAL '7 day')::text
    GROUP BY
        repo_name
    ORDER BY
        events DESC
    LIMIT 5;
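    As a further illustrative query against the same table, the following finds the most active contributors by event count over the same period (adjust the ds filter to match the range you imported):

```sql
SELECT
    actor_login,
    COUNT(*) AS events
FROM
    hologres_dataset_github_event.hologres_github_event
WHERE
    ds >= (CURRENT_DATE - INTERVAL '7 day')::text
GROUP BY
    actor_login
ORDER BY
    events DESC
LIMIT 5;
```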