
Realtime Compute for Apache Flink:Management console permissions

Last Updated: Mar 06, 2026

To access the Realtime Compute for Apache Flink console using a Resource Access Management (RAM) user or RAM role and perform operations such as viewing, purchasing, or deleting workspaces, you must have the required permissions. For security purposes, the Alibaba Cloud account administrator who purchased the Flink workspace must attach the appropriate access policies to the relevant principals in the RAM console. This topic describes the supported access policies and the specific authorization configurations.

Authorization scenarios

Scenario 1: Unable to access the Realtime Compute for Apache Flink console

You cannot view any workspace information, and the following error message appears.

image

This error indicates that you do not have the permissions to access the Realtime Compute for Apache Flink console. Contact the Alibaba Cloud account administrator who purchased the workspace to grant your account at least read-only permissions for Realtime Compute for Apache Flink (AliyunStreamReadOnlyAccess). For more information, see Authorization procedure. After the authorization is complete, refresh the page or open the console again.

Scenario 2: Unable to perform a specific operation

image

This error indicates that the current account does not have the permissions to perform the operation. Contact the Alibaba Cloud account administrator who purchased the workspace to adjust the custom policy based on your requirements and complete the authorization. For more information, see Authorization procedure. For example, in the preceding figure, the account must be granted permissions related to resource allocation for subscription workspaces.

Policy types

An access policy is a set of permissions described using policy syntax and structure. You can use an access policy to specify the authorized resource set, operation set, and authorization conditions. The RAM console supports the following types of access policies:

  • System policies: System policies are created by Alibaba Cloud. You can use but cannot modify these policies. Alibaba Cloud maintains the version updates of these policies. The following table describes the system policies that Flink supports.

    • AliyunStreamFullAccess (full access to Realtime Compute for Apache Flink): Includes all permissions described in Custom policies.

    • AliyunStreamReadOnlyAccess (read-only access to Realtime Compute for Apache Flink): Includes the HasStreamDefaultRole permission and all permissions that start with Describe, Query, Check, List, Get, and Search in Realtime Compute for Apache Flink permission policies.

    • AliyunBSSOrderAccess: The permissions to view and pay for orders in User Center (BSS).

    • AliyunBSSRefundAccess: The permission to unsubscribe from orders in User Center (BSS).

  • Custom policies: You can create, update, and delete custom policies. You are responsible for maintaining the version updates of your custom policies. For more information about the custom policies that Flink supports and how to create a custom policy, see Realtime Compute for Apache Flink permission policies and (Optional) Step 1: Create a custom policy.

Prerequisites

You are familiar with the authorization instructions.

Authorization procedure

(Optional) Step 1: Create a custom policy

If you want to use the AliyunStreamFullAccess system policy, you can skip this step.

When you create a custom policy, we recommend that you use the read-only permissions for Realtime Compute for Apache Flink as a basis and then add more fine-grained access control points as needed. These access control points include the custom policies and permission operations for related products that Realtime Compute for Apache Flink supports. The following code provides the details of a custom policy that grants read-only permissions for Realtime Compute for Apache Flink. The permission scope is the same as that of the AliyunStreamReadOnlyAccess system policy.

{
  "Version": "1",
  "Statement": [
    {
      "Action": [
        "stream:Describe*",
        "stream:Query*",
        "stream:Check*",
        "stream:List*",
        "stream:Get*",
        "stream:Search*",
        "stream:HasStreamDefaultRole"
      ],
       "Resource": "acs:stream:{#regionId}:{#accountId}:vvpinstance/{#instanceId}/vvpnamespace/{#namespace}",
      "Effect": "Allow"
    }
  ]
}
  • For more information about how to create a custom policy and for examples, see Create a custom policy and Custom policy examples.

  • In an access policy, Action defines the operation, Resource defines the target object, and Effect defines whether the operation is allowed or denied. For more information about the syntax and structure of access policies, see Policy elements and Policy structure and syntax. Replace the parameters in the policy with your actual values. The following list describes these parameters.

    • {#regionId}: The region where the destination Flink workspace resides.

    • {#accountId}: The UID of the Alibaba Cloud account.

    • {#instanceId}: The ID of the destination Realtime Compute for Apache Flink order instance.

    • {#namespace}: The name of the destination project.
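As a sketch of the substitution described above, the following Python snippet fills the placeholders and assembles the read-only policy document. The region, account UID, instance ID, and namespace values are hypothetical; replace them with your own.

```python
import json

# Hypothetical values: replace with your region, account UID,
# workspace instance ID, and project (namespace) name.
params = {
    "regionId": "cn-hangzhou",
    "accountId": "123456789012345",
    "instanceId": "vvp-example",
    "namespace": "default",
}

# Resource ARN template from the custom policy above.
resource = (
    "acs:stream:{regionId}:{accountId}:"
    "vvpinstance/{instanceId}/vvpnamespace/{namespace}"
).format(**params)

# Read-only policy with the same scope as AliyunStreamReadOnlyAccess.
policy = {
    "Version": "1",
    "Statement": [
        {
            "Action": [
                "stream:Describe*",
                "stream:Query*",
                "stream:Check*",
                "stream:List*",
                "stream:Get*",
                "stream:Search*",
                "stream:HasStreamDefaultRole",
            ],
            "Resource": resource,
            "Effect": "Allow",
        }
    ],
}

print(json.dumps(policy, indent=2))
```

You can paste the printed JSON into the policy editor when you create the custom policy in the RAM console.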

Step 2: Attach the policy to a RAM user or RAM role

You can attach an access policy to a RAM user or RAM role to grant the permissions specified in the policy. This section describes how to grant permissions to a RAM user. The procedure for granting permissions to a RAM role is similar. For more information, see Manage RAM roles.

  1. Log on to the RAM console as a RAM administrator.

  2. In the left-side navigation pane, choose Identities > Users.

  3. On the Users page, find the required RAM user, and click Add Permissions in the Actions column.

    image

    You can also select multiple RAM users and click Add Permissions in the lower part of the page to grant permissions to multiple RAM users at a time.

  4. In the Add Permissions panel, add permissions to the RAM user.

    image

    • Scope: Select the required application scope.

      • Alibaba Cloud Account: The permissions take effect within the current Alibaba Cloud account.

      • Specific Resource Group: The permissions take effect within the specified resource group.

    • Principal: The principal to which you want to grant permissions, that is, the RAM user that you want to authorize. The system automatically specifies the RAM user that you selected. You can also add other RAM users.

    • Access Policy: Select a system policy or a custom policy that you have created.

  5. Click Confirm New Authorization.

  6. Click Close.

Step 3: Log on to the console after authorization

After the authorization is complete, the RAM user or RAM role can log on to the Realtime Compute for Apache Flink console or refresh the current page to perform the related operations.

  • Alibaba Cloud RAM user: Log on as a RAM user. For more information, see Log on to the Alibaba Cloud Management Console as a RAM user.

  • Alibaba Cloud RAM role:

    • A RAM user of Alibaba Cloud account A assumes a role of account A to log on. For more information, see Assuming a RAM role.

    • A RAM user of Alibaba Cloud account B assumes a role of account A to log on. For more information, see Accessing resources across Alibaba Cloud accounts.

  • Resource directory member:

    • A RAM user of the management account assumes a RAM role of a member to log on. For more information, see Log on to the Alibaba Cloud console by assuming a RAM role.

    • Log on as a RAM user of a member. For more information, see Log on to the Alibaba Cloud Management Console as a RAM user.

  • Alibaba Cloud account (root user) (not recommended): Log on to the Alibaba Cloud Management Console as the root user.

  • CloudSSO user:

    • A CloudSSO user logs on by assuming a RAM role. For more information, see Use CloudSSO to centrally manage identities and permissions across multiple enterprise accounts.

    • A CloudSSO user logs on as a RAM user.

Custom policy examples

Example 1: A RAM user activates a Realtime Compute for Apache Flink workspace

To allow a RAM user to activate a subscription Realtime Compute for Apache Flink workspace that uses fully managed storage and free monitoring, you can create and grant a custom policy that includes the following permissions:

  • Read-only permissions for Realtime Compute for Apache Flink: stream:Describe*, stream:Query*, stream:Check*, stream:List*, stream:Get*, stream:Search*, and stream:HasStreamDefaultRole.

  • Permission to purchase a Realtime Compute for Apache Flink workspace: stream:CreateVvpInstance

  • Permission to allow Flink to query existing VPCs: vpc:DescribeVpcs

  • Permission to allow Flink to query existing vSwitches: vpc:DescribeVSwitches

  • Permissions to view and pay for orders in User Center: bss:DescribeOrderList, bss:DescribeOrderDetail, bss:PayOrder, and bss:CancelOrder

The full custom policy is as follows.

{
  "Version": "1",
  "Statement": [
    {
      "Action": [
        "stream:Describe*",
        "stream:Query*",
        "stream:Check*",
        "stream:List*",
        "stream:Get*",
        "stream:Search*",
        "stream:HasStreamDefaultRole",
        "stream:CreateVvpInstance",
        "vpc:DescribeVpcs",
        "vpc:DescribeVSwitches",
        "bss:DescribeOrderList",
        "bss:DescribeOrderDetail",
        "bss:PayOrder",
        "bss:CancelOrder"
      ],
      "Resource": "*",
      "Effect": "Allow"
    }
  ]
}

Example 2: A RAM user activates a Realtime Compute for Apache Flink workspace (with an existing system policy)

If a RAM user already has the AliyunStreamFullAccess system policy, you must also create and grant a custom policy to the RAM user to allow the user to activate a subscription Realtime Compute for Apache Flink workspace that uses fully managed storage and free monitoring. The custom policy must include the following permissions:

  • Permission to allow Flink to query existing VPCs: vpc:DescribeVpcs

  • Permission to allow Flink to query existing vSwitches: vpc:DescribeVSwitches

  • Permissions to view and pay for orders in User Center: bss:DescribeOrderList, bss:DescribeOrderDetail, bss:PayOrder, and bss:CancelOrder

The following code provides the complete custom policy.

{
  "Version": "1",
  "Statement": [
    {
      "Action": [
        "vpc:DescribeVpcs",
        "vpc:DescribeVSwitches",
        "bss:DescribeOrderList",
        "bss:DescribeOrderDetail",
        "bss:PayOrder",
        "bss:CancelOrder"
      ],
      "Resource": "*",
      "Effect": "Allow"
    }
  ]
}

Example 3: A RAM user releases a subscription Flink workspace

To allow a RAM user to release a subscription Flink workspace, you can create and grant a custom policy that includes the following permissions:

  • Read-only permissions for Realtime Compute for Apache Flink: stream:Describe*, stream:Query*, stream:Check*, stream:List*, stream:Get*, stream:Search*, and stream:HasStreamDefaultRole.

  • Permission to unsubscribe from orders in User Center: bss:Describe* and bss:Refund*

The following code provides the complete custom policy.

{
  "Version": "1",
  "Statement": [
    {
      "Action": [
        "stream:Describe*",
        "stream:Query*",
        "stream:Check*",
        "stream:List*",
        "stream:Get*",
        "stream:Search*",
        "stream:HasStreamDefaultRole",
        "bss:Describe*",
        "bss:Refund*"
      ],
      "Resource": "*",
      "Effect": "Allow"
    }
  ]
}

Example 4: A RAM user releases a pay-as-you-go Flink workspace

To allow a RAM user to release a pay-as-you-go Flink workspace, you can create and grant a custom policy that includes the following permissions:

  • Read-only permissions for Realtime Compute for Apache Flink: stream:Describe*, stream:Query*, stream:Check*, stream:List*, stream:Get*, stream:Search*, and stream:HasStreamDefaultRole.

  • Permission to release a Flink workspace: stream:DeleteVvpInstance

The following code provides the complete custom policy.

{
  "Version": "1",
  "Statement": [
    {
      "Action": [
        "stream:Describe*",
        "stream:Query*",
        "stream:Check*",
        "stream:List*",
        "stream:Get*",
        "stream:Search*",
        "stream:HasStreamDefaultRole",
        "stream:DeleteVvpInstance"
      ],
      "Resource": "*",
      "Effect": "Allow"
    }
  ]
}

Example 5: A RAM user allocates resources for a project

To allow a RAM user to change the resource configuration of a project in a pay-as-you-go workspace, you can create and grant a custom policy that includes the following permissions:

  • Read-only permissions for Realtime Compute for Apache Flink: stream:Describe*, stream:Query*, stream:Check*, stream:List*, stream:Get*, stream:Search*, and stream:HasStreamDefaultRole.

  • Permission to change the resources of a pay-as-you-go project: stream:ModifyVvpNamespaceSpec

The following code provides the complete custom policy.

{
  "Version": "1",
  "Statement": [
    {
      "Action": [
        "stream:Describe*",
        "stream:Query*",
        "stream:Check*",
        "stream:List*",
        "stream:Get*",
        "stream:Search*",
        "stream:HasStreamDefaultRole",
        "stream:ModifyVvpNamespaceSpec"
      ],
      "Resource": "*",
      "Effect": "Allow"
    }
  ]
}

Custom policies

Realtime Compute for Apache Flink permission policies

Important

Before you configure permissions for a project, you must configure the permission to view existing workspaces (DescribeVvpInstances). Otherwise, a permission error occurs.
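As a sketch of the point above, a project-level policy can carry the workspace-view permission alongside the project actions as two statements in one policy document. The region, account UID, instance ID, and namespace values below are hypothetical.

```python
import json

# Hypothetical placeholder values.
region, account = "cn-hangzhou", "123456789012345"
instance, namespace = "vvp-example", "default"

policy = {
    "Version": "1",
    "Statement": [
        {
            # Required first: permission to view existing workspaces,
            # without which a permission error occurs.
            "Action": ["stream:DescribeVvpInstances"],
            "Resource": f"acs:stream:{region}:{account}:vvpinstance/*",
            "Effect": "Allow",
        },
        {
            # Project-level permission, scoped to one namespace.
            "Action": ["stream:DescribeVvpNamespaces"],
            "Resource": (
                f"acs:stream:{region}:{account}:"
                f"vvpinstance/{instance}/vvpnamespace/{namespace}"
            ),
            "Effect": "Allow",
        },
    ],
}

print(json.dumps(policy, indent=2))
```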

Flink workspaces

{
  "Version": "1",
  "Statement": [
    {
      "Action": [
        "stream:CreateVvpInstance",
        "stream:DescribeVvpInstances",
        "stream:DeleteVvpInstance",
        "stream:RenewVvpInstance",
        "stream:ModifyVvpPrepayInstanceSpec",
        "stream:ModifyVvpInstanceSpec",
        "stream:ConvertVvpInstance",
        "stream:QueryCreateVvpInstance",
        "stream:QueryRenewVvpInstance",
        "stream:QueryModifyVvpPrepayInstanceSpec",
        "stream:QueryConvertVvpInstance"
      ],
      "Resource": "acs:stream:{#regionId}:{#accountId}:vvpinstance/{#InstanceId}",
      "Effect": "Allow"
    }
  ]
}

The actions in this policy are described as follows:

  • CreateVvpInstance: Purchase a Realtime Compute for Apache Flink workspace.
  • DescribeVvpInstances: View workspaces.
  • DeleteVvpInstance: Release a Flink workspace.
  • RenewVvpInstance: Renew a subscription workspace.
  • ModifyVvpPrepayInstanceSpec: Scale a subscription workspace.
  • ModifyVvpInstanceSpec: Adjust the quota of a pay-as-you-go workspace.
  • ConvertVvpInstance: Change the billing method of a workspace.
  • QueryCreateVvpInstance: Query the price for creating a workspace.
  • QueryRenewVvpInstance: Query the price for renewing a workspace.
  • QueryModifyVvpPrepayInstanceSpec: Query the price for scaling a workspace.
  • QueryConvertVvpInstance: Query the price for changing the billing method from pay-as-you-go to subscription.

Note

When purchasing a Realtime Compute for Apache Flink workspace and viewing workspaces, you can change "Resource": "acs:stream:{#regionId}:{#accountId}:vvpinstance/{#instanceId}" to "Resource": "acs:stream:{#regionId}:{#accountId}:vvpinstance/*".

Flink projects

{
  "Version": "1",
  "Statement": [
    {
      "Action": [
        "stream:CreateVvpNamespace",
        "stream:DeleteVvpNamespace",
        "stream:ModifyVvpPrepayNamespaceSpec",
        "stream:ModifyVvpNamespaceSpec",
        "stream:DescribeVvpNamespaces"
      ],
       "Resource": "acs:stream:{#regionId}:{#accountId}:vvpinstance/{#instanceId}/vvpnamespace/{#namespace}",
      "Effect": "Allow"
    }
  ]
}

The actions in this policy are described as follows:

  • CreateVvpNamespace: Create a project.
  • DeleteVvpNamespace: Delete a project.
  • ModifyVvpPrepayNamespaceSpec: Change the resources of a subscription project.
  • ModifyVvpNamespaceSpec: Change the resources of a pay-as-you-go project.
  • DescribeVvpNamespaces: View a list of projects.

After you configure this policy, you can click the expand icon to the left of a destination workspace ID to view the list of projects that are created in the workspace. If you want to go to the development console of a destination project, you must be granted the permissions to develop jobs in the project. For more information, see Development console authorization.

Note

When creating a project and viewing a list of projects, you can change "Resource": "acs:stream:{#regionId}:{#accountId}:vvpinstance/{#instanceId}/vvpnamespace/{#namespace}" to "Resource": "acs:stream:{#regionId}:{#accountId}:vvpinstance/{#instanceId}/vvpnamespace/*".

Permission operations for related products

ECS-related permission operations

To access the development console from the internet, you must activate Elastic IP Address (EIP). To connect to resources in a Virtual Private Cloud (VPC), you must create elastic network interfaces (ENIs) in the VPC. These ENIs are added to a dedicated security group for serverless Flink. In this case, Flink requires permissions to operate the EIP, security group, and ENIs.

  • ecs:AssociateEipAddress: Request an EIP address that can be used to access the Flink service over the public network.
  • ecs:AttachNetworkInterface: Allow the Flink service to attach your ENI to a Flink resource pool.
  • ecs:AuthorizeSecurityGroup: Add an inbound rule to the security group that the Flink service creates.
  • ecs:AuthorizeSecurityGroupEgress: Add an outbound rule to the security group that the Flink service creates.
  • ecs:CreateNetworkInterface: Allow the Flink service to create an ENI in your VPC to support the connection from the Flink service to your VPC.
  • ecs:CreateNetworkInterfacePermission: Allow the Flink service to grant permissions on ENIs.
  • ecs:CreateSecurityGroup: Create the security group that the Flink service uses.
  • ecs:DeleteNetworkInterface: Delete the ENIs of related resources after a Flink task is complete.
  • ecs:DeleteNetworkInterfacePermission: Allow the Flink service to detach your ENI.
  • ecs:DeleteSecurityGroup: Delete the security group that the Flink service creates.
  • ecs:DescribeNetworkInterfacePermissions: Allow your ENI to be detached from a serverless Flink resource pool.
  • ecs:DescribeNetworkInterfaces: Allow the Flink service to query ENIs.
  • ecs:DescribeSecurityGroupAttribute: Allow the Flink service to query the rules of a security group.
  • ecs:DescribeSecurityGroupReferences: Allow the Flink service to query security groups and security group-level authorization behaviors.
  • ecs:DescribeSecurityGroups: Allow the Flink service to query the basic information about created security groups.
  • ecs:DetachNetworkInterface: Allow the Flink service to detach your ENI from a Flink resource pool.
  • ecs:JoinSecurityGroup: Allow the Flink service to add an ENI to a specified security group.
  • ecs:LeaveSecurityGroup: Allow the Flink service to remove an ENI from a specified security group.
  • ecs:ModifyNetworkInterfaceAttribute: Allow the Flink service to modify the name, description, and security group of an ENI.
  • ecs:ModifySecurityGroupAttribute: Allow the Flink service to modify the name or description of a security group.
  • ecs:ModifySecurityGroupPolicy: Allow the Flink service to modify the connectivity policy in a security group.
  • ecs:ModifySecurityGroupRule: Allow the Flink service to modify the description of an inbound security group rule.
  • ecs:RevokeSecurityGroup: Allow the Flink service to delete an inbound security group rule.
  • ecs:RevokeSecurityGroupEgress: Allow the Flink service to delete an outbound security group rule.
  • ecs:UnassociateEipAddress: Allow the Flink service to release an EIP.

OSS-related permission operations

To view the list of OSS buckets, you must be granted the required OSS-related permissions.

  • oss:ListBuckets: Allow the Flink service to view the list of OSS buckets.
  • oss:GetBucketInfo: Obtain information about a bucket.
  • oss:GetObjectMetadata: Obtain the metadata of an object.
  • oss:GetObject: Obtain an object.
  • oss:ListObjects: List information about all objects in a bucket.
  • oss:PutObject: Upload an object.
  • oss:CopyObject: Copy objects within a bucket or between buckets in the same region.
  • oss:CompleteMultipartUpload: Complete the multipart upload of an object after all parts are uploaded.
  • oss:AbortMultipartUpload: Cancel a multipart upload event and delete the corresponding part data.
  • oss:InitiateMultipartUpload: Notify OSS to initialize a multipart upload event before data is transmitted in multipart upload mode.
  • oss:UploadPartCopy: Copy data from an existing object to upload a part.
  • oss:UploadPart: Upload data by part based on the specified object name and upload ID.
  • oss:DeleteObject: Delete an object.
  • oss:PutBucketCors: Set cross-origin resource sharing (CORS) rules for a specified bucket.
  • oss:GetBucketCors: Obtain the current CORS rules of a specified bucket.
  • oss:PutBucket: Create a bucket.

Note

If you use the Key Management Service (KMS) encryption feature of OSS, you must add KMS-related access policies to the AliyunStreamAsiDefaultRole role to ensure that the feature can be used as expected. For more information about the policies, see Upload a file to a bucket for which default encryption is configured.
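The OSS operations above can be folded into a custom policy statement. The following sketch builds a read-only subset; the account UID and bucket name are hypothetical, and the statement assumes the standard OSS resource ARN format acs:oss:*:{accountId}:{bucket} for the bucket and acs:oss:*:{accountId}:{bucket}/* for its objects.

```python
import json

account_id = "123456789012345"   # hypothetical account UID
bucket = "example-flink-bucket"  # hypothetical bucket name

# Read-only subset of the OSS operations listed above.
read_actions = [
    "oss:ListBuckets",
    "oss:GetBucketInfo",
    "oss:GetObjectMetadata",
    "oss:GetObject",
    "oss:ListObjects",
]

policy = {
    "Version": "1",
    "Statement": [
        {
            "Action": read_actions,
            # The bucket itself and every object in it.
            "Resource": [
                f"acs:oss:*:{account_id}:{bucket}",
                f"acs:oss:*:{account_id}:{bucket}/*",
            ],
            "Effect": "Allow",
        }
    ],
}

print(json.dumps(policy, indent=2))
```

Add the write and multipart actions from the table above to the Action list if the workspace also needs to upload or delete objects.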

ARMS-related permission operations

Flink metrics are stored in Application Real-Time Monitoring Service (ARMS). Therefore, the ARMS service is activated for you, and Flink requires the following ARMS-related permissions.

  • arms:ListDashboards: View ARMS dashboard information.
  • arms:CreateContact: Create a contact.
  • arms:DeleteContact: Delete a contact.
  • arms:SearchContact: Search for a contact.
  • arms:UpdateContact: Update a contact.
  • arms:CreateContactGroup: Create a contact group.
  • arms:DeleteContactGroup: Delete a contact group.
  • arms:SearchContactGroup: Search for a contact group.
  • arms:UpdateContactGroup: Update a contact group.
  • arms:SearchAlertRules: Search for alert rules.
  • arms:CreateAlertRules: Create alert rules.
  • arms:UpdateAlertRules: Update alert rules.
  • arms:DeleteAlertRules: Delete alert rules.
  • arms:StartAlertRule: Start an alert rule.
  • arms:StopAlertRule: Pause an alert rule.
  • arms:SearchAlarmHistories: View historical alert information.
  • arms:OpenArmsService: Activate the ARMS service.
  • arms:CreateWebhook: Create a webhook.
  • arms:UpdateWebhook: Update a webhook.
  • arms:CreateDispatchRule: Create a dispatch rule.
  • arms:ListDispatchRule: View the list of dispatch rules.
  • arms:DeleteDispatchRule: Delete a dispatch rule.
  • arms:UpdateDispatchRule: Update a dispatch rule.
  • arms:DescribeDispatchRule: View the details of a dispatch rule.
  • arms:GetAlarmHistories: Obtain the alert sending history.
  • arms:SaveAlert: Save an alert rule.
  • arms:DeleteAlert: Delete an alert rule.
  • arms:GetAlert: Obtain an alert rule.
  • arms:CheckServiceStatus: Check the service activation status.
  • arms:InstallManagedPrometheus: Create a managed Prometheus instance.
  • arms:UninstallManagedPrometheus: Uninstall a managed Prometheus instance.
  • arms:GetManagedPrometheusStatus: Obtain the installation status of a managed Prometheus instance.

VPC-related permission operations

During the activation of a Flink workspace, the Describe permission for resources in a VPC is required.

  • vpc:DescribeVpcAttribute: Allow the Flink service to query the configuration information of a specified VPC.
  • vpc:DescribeVpcs: Allow the Flink service to query created VPCs.
  • vpc:DescribeVSwitchAttributes: Allow the Flink service to query the information about a specified vSwitch.
  • vpc:DescribeVSwitches: Allow the Flink service to query created vSwitches.
  • vpc:DescribeRouteTableList: Allow the Flink service to query a list of route tables.
  • vpc:DescribeRouteTables: Allow the Flink service to query a specified route table.
  • vpc:DescribeRouteEntryList: Allow the Flink service to query a list of route entries.
  • vpc:DescribeRouterInterfaceAttribute: Allow the Flink service to query router interface configurations.
  • vpc:DescribeRouterInterfaces: Allow the Flink service to query router interfaces.
  • vpc:DescribeVRouters: Allow the Flink service to query a list of vRouters in a specified region.
  • vpc:CreateVpc: Create a VPC.
  • vpc:CreateVSwitch: Create a vSwitch.

RAM-related permission operations

During the activation of a Flink workspace, RAM-related permissions are required for resource configuration.

  • ram:*: The permissions to add, delete, modify, and query the domain and application RAM resources.

Tag-related permission operations

  • tag:ListTagResources: Query a list of resource tags.
  • tag:ListTagKeys: Query a list of tag keys.
  • tag:ListTagValues: Query the tag values that correspond to a specified tag key.

DLF-related permission operations

During the activation of a Flink workspace, Data Lake Formation (DLF) permissions are required to access DLF-related catalogs.

  • dlf:BatchCreatePartitions: Create multiple partitions at a time.
  • dlf:BatchCreateTables: Create multiple tables at a time.
  • dlf:BatchDeletePartitions: Delete multiple partitions at a time.
  • dlf:BatchDeleteTables: Delete multiple tables at a time.
  • dlf:BatchGetPartitions: Obtain multiple partitions at a time.
  • dlf:BatchGetTables: Obtain multiple tables at a time.
  • dlf:BatchUpdatePartitions: Update multiple partitions at a time.
  • dlf:BatchUpdateTables: Update multiple tables at a time.
  • dlf:CreateCatalog: Create a data lake catalog.
  • dlf:CreateDatabase: Create a database.
  • dlf:CreateFunction: Create a function.
  • dlf:CreatePartition: Create a partition.
  • dlf:CreateTable: Create a table.
  • dlf:DeleteCatalog: Delete a data lake catalog.
  • dlf:DeleteDatabase: Delete a database.
  • dlf:DeleteFunction: Delete a function.
  • dlf:DeletePartition: Delete a partition.
  • dlf:DeleteTable: Delete a table.
  • dlf:GetAsyncTaskStatus: Obtain the status of an asynchronous task.
  • dlf:GetCatalog: Obtain a data lake catalog.
  • dlf:GetCatalogByInstanceId: Obtain a catalog by instance ID.
  • dlf:GetCatalogSettings: Obtain the configurations of a data lake.
  • dlf:GetDatabase: Obtain a database.
  • dlf:GetFunction: Obtain a function.
  • dlf:GetPartition: Obtain a partition.
  • dlf:GetTable: Obtain a table.
  • dlf:ListCatalogs: Obtain a list of catalogs.
  • dlf:ListDatabases: Obtain a list of databases.
  • dlf:ListFunctionNames: Obtain a list of function names.
  • dlf:ListFunctions: Obtain a list of functions.
  • dlf:ListPartitionNames: Obtain a list of partition names.
  • dlf:ListPartitions: Obtain a list of partitions.
  • dlf:ListPartitionsByExpr: Obtain a list of partitions by expression.
  • dlf:ListPartitionsByFilter: Obtain a list of partitions by filter.
  • dlf:ListTableNames: Obtain a list of table names.
  • dlf:ListTables: Obtain a list of tables.
  • dlf:RenamePartition: Rename a partition.
  • dlf:RenameTable: Rename a table.
  • dlf:UpdateCatalog: Update a data lake catalog.
  • dlf:UpdateDatabase: Update a database.
  • dlf:UpdateFunction: Update a function.
  • dlf:UpdateTable: Update a table.
  • dlf:BatchGetPartitionColumnStatistics: Obtain the statistics of metadata partitions at a time.
  • dlf:DeletePartitionColumnStatistics: Delete the statistics of a metadata table partition.
  • dlf:DeleteTableColumnStatistics: Delete the statistics of a metadata table.
  • dlf:GetPartitionColumnStatistics: Obtain the statistics of a metadata partition field.
  • dlf:GetTableColumnStatistics: Obtain the statistics of a metadata table field.
  • dlf:UpdateTableColumnStatistics: Update the statistics of a metadata table.
  • dlf:UpdatePartitionColumnStatistics: Update the statistics of a metadata table partition.
  • dlf:CreateLock: Create a metadata lock.
  • dlf:UnLock: Unlock a specified metadata lock.
  • dlf:AbortLock: Abort a metadata lock.
  • dlf:RefreshLock: Refresh a metadata lock.
  • dlf:GetLock: Query a metadata lock.
  • dlf:GetCatalogAccessInfo: Use a CatalogUuid to obtain backend storage information such as StorageName and StorageEndpoint.
  • dlf:GetDataToken: Use a UUID to obtain a catalog-level or table-level data key.
  • dlf:GetDataTokenByName: Use a CatalogUuid, DatabaseName, and TableName to obtain a catalog-level or table-level data key.
  • dlf-auth:ActOnBehalfOfAnotherUser: Pass through an identity. A service-linked role (SLR) or service role (SR) accesses DLF on behalf of another user.
  • dlf:GrantPermissions: Grant a principal permissions on resources.
  • dlf:RevokePermissions: Revoke permissions on resources from a principal.
  • dlf:BatchGrantPermissions: Grant permissions in a batch.
  • dlf:BatchRevokePermissions: Revoke permissions in a batch.
  • dlf:UpdatePermissions: Update the permissions of a principal on resources.
  • dlf:ListPermissions: Obtain the permission information of a specified resource or principal.
  • dlf:CreateRole: Create a role.
  • dlf:UpdateRole: Update a role.
  • dlf:DeleteRole: Delete a role.
  • dlf:GetRole: Obtain a role.
  • dlf:ListRoles: Query a list of roles.
  • dlf:GrantRolesToUser: Grant multiple role permissions to a specified user at a time.
  • dlf:RevokeRolesFromUser: Revoke multiple role permissions from a specified user at a time.
  • dlf:GrantRoleToUsers: Grant a specified role permission to multiple users at a time.
  • dlf:RevokeRoleFromUsers: Revoke a specified role permission from multiple users at a time.
  • dlf:UpdateRoleUsers: Update the users in a role.
  • dlf:ListRoleUsers: Query a list of users in a role.
  • dlf:ListUserRoles: Query a list of user roles.
  • dlf:GrantRolesToPrincipal: Grant multiple role permissions to a specified principal at a time.
  • dlf:RevokeRolesFromPrincipal: Revoke multiple role permissions from a specified principal at a time.
  • dlf:GrantRoleToPrincipals: Grant a specified role permission to multiple principals at a time.
  • dlf:RevokeRoleFromPrincipals: Revoke a specified role permission from multiple principals at a time.
  • dlf:UpdateRolePrincipals: Update the principals in a role.
  • dlf:BatchDeleteRoles: Delete multiple roles at a time.
  • dlf:CheckPermissions: Check permissions.
  • dlf:GetCatalogStorageStatistics: Obtain catalog statistics metrics.
  • dlf:GetCatalogStorageIndicatorDetails: Obtain catalog metric trends.
  • dlf:GetCatalogStorageRank: Obtain catalog storage statistics rankings.
  • dlf:GetCatalogStorageAnalysis: Obtain catalog storage distribution data.
  • dlf:GetDatabaseProfile: Obtain database data profiles.
  • dlf:GetDatabaseStorageAnalysis: Obtain database storage distribution data.
  • dlf:GetTableProfile: Obtain table data profiles.
  • dlf:GetTableStorageAnalysis: Obtain table storage distribution data.
  • dlf:ListPartitionProfiles: Obtain a list of partition data profiles.
  • dlf:getLatestStorageStatisticsDate: Obtain the last update time of storage overview data.
  • dlf:SubscribeOptimize: Submit for optimization.
  • dlf:GetOptimizeRegionStatus: Obtain the optimization region status.
  • dlf:GetOptimizeWorkspaceAuthorization: Obtain the authentication of an optimization workspace.
  • dlf:AddOptimizeWorkspace: Add an optimization workspace.
  • dlf:ListOptimizeWorkspaces: Obtain a list of optimization workspaces.
  • dlf:PreCheckOptimizeWorkspaceConnection: Precheck an optimization workspace connection.
  • dlf:CheckOptimizeWorkspaceConnection: Check an optimization workspace connection.
  • dlf:DeleteOptimizeWorkspace: Delete an optimization workspace.
  • dlf:SetOptimizeEnable: Set the storage optimization switch.
  • dlf:SetOptimizePolicy: Set a storage optimization policy.
  • dlf:GetOptimizePolicy: Obtain a storage optimization policy.
  • dlf:SetOptimizeScheduleRule: Add a storage optimization scheduling rule.
  • dlf:ListOptimizeScheduleRules: Obtain a list of optimization schedules.
  • dlf:DeleteOptimizeScheduleRule: Delete a storage optimization scheduling rule.
  • dlf:RunOptimizeImmediately: Run storage management optimization immediately.
  • dlf:GetOptimizeInfo: Obtain optimization information.
  • dlf:UpdateOptimizeTaskResult: Update a storage optimization task result.
  • dlf:BatchDeleteTableVersions: Delete specified versions of a Data Lake table at a time.
  • dlf:DeleteTableVersion: Delete a specified version of a Data Lake table.
  • dlf:GetTableVersion: Obtain a specified version of a Data Lake table.
  • dlf:ListTableVersions: Query a list of specified versions of a Data Lake table by page.
  • dlf:Search: Retrieve metadata.
  • dlf:SearchAcrossCatalog: Search for content such as databases, tables, and fields across catalogs.
  • dlf:GetServiceStatus: Obtain the service activation status of a user for Data Lake Formation.
  • dlf:GetRegionStatus: Obtain the service activation status of Data Lake Formation in a specified region.
