Realtime Compute for Apache Flink:Management portal permissions

Last Updated: Mar 26, 2026

To access the Realtime Compute for Apache Flink console and perform operations such as viewing, purchasing, or deleting workspaces, RAM users and RAM roles must have the required permissions. The Alibaba Cloud account administrator who owns the Flink workspace grants access by attaching policies in the RAM console.

Who should read this topic

How you use Resource Access Management (RAM) depends on your role.

Authorization scenarios

  • Unable to access the Realtime Compute for Apache Flink console: You cannot view any workspace information because your account does not have permission to access the Flink console. Contact the Alibaba Cloud account administrator who purchased the workspace and ask them to grant your account at least read-only permissions (AliyunStreamReadOnlyAccess). See Authorization procedure. After authorization, log on again or refresh the page.
  • Unable to perform a specific operation: The current account lacks permission for that operation. Contact the Alibaba Cloud account administrator to adjust the custom policy based on your requirements. See Authorization procedure. For example, to allocate resources for a subscription workspace, your account must be granted the corresponding resource allocation permission.

Policy types

An access policy defines a set of permissions using policy syntax and structure. It specifies the authorized resources, allowed operations, and authorization conditions. The RAM console supports two types of access policies:

  • System policies: Created and maintained by Alibaba Cloud. You can use but cannot modify these policies. The following table lists the system policies that Flink supports.

    Access policy Name Description
    Full access to Realtime Compute for Apache Flink AliyunStreamFullAccess Includes all permissions in Custom policies.
    Read-only access to Realtime Compute for Apache Flink AliyunStreamReadOnlyAccess Includes the HasStreamDefaultRole permission and all permissions that start with Describe, Query, Check, List, Get, and Search in Realtime Compute for Apache Flink permission policies.
    View and pay for orders in User Center (BSS) AliyunBSSOrderAccess Grants permissions to view and pay for orders in User Center.
    Unsubscribe from orders in User Center (BSS) AliyunBSSRefundAccess Grants permission to unsubscribe from orders in User Center.
  • Custom policies: Created, updated, and deleted by you. For more information about the custom policies that Flink supports and how to create one, see Realtime Compute for Apache Flink permission policies and (Optional) Step 1: Create a custom policy.

Prerequisites

Before you begin, ensure that you have:

Authorization procedure

(Optional) Step 1: Create a custom policy

Skip this step if you plan to use the AliyunStreamFullAccess system policy.

Start from the read-only permission set for Realtime Compute for Apache Flink, and then add fine-grained permissions as needed. These include the custom policies for Realtime Compute for Apache Flink and the permission operations for related products, both of which are described later in this topic.

The following policy grants read-only permissions for Realtime Compute for Apache Flink, equivalent to the AliyunStreamReadOnlyAccess system policy.

{
  "Version": "1",
  "Statement": [
    {
      "Action": [
        "stream:Describe*",
        "stream:Query*",
        "stream:Check*",
        "stream:List*",
        "stream:Get*",
        "stream:Search*",
        "stream:HasStreamDefaultRole"
      ],
      "Resource": "acs:stream:{#regionId}:{#accountId}:vvpinstance/{#instanceId}/vvpnamespace/{#namespace}",
      "Effect": "Allow"
    }
  ]
}

Replace the placeholders with your actual values:

Placeholder Description
{#regionId} The region where the destination Flink workspace resides
{#accountId} The UID of the Alibaba Cloud account
{#instanceId} The ID of the destination Realtime Compute for Apache Flink order instance
{#namespace} The name of the destination project
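
With the placeholders replaced, the Resource element resolves to a concrete Alibaba Cloud Resource Name (ARN). For example, the following value is a sketch that uses hypothetical values (a masked account UID, and illustrative instance and project names):

```json
"Resource": "acs:stream:cn-hangzhou:123456789012****:vvpinstance/your-instance-id/vvpnamespace/your-project"
```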

In an access policy, Action defines the operation, Resource defines the target object, and Effect defines whether the operation is allowed or denied. For more information about policy syntax, see Policy elements and Policy structure and syntax.

For detailed instructions and examples, see Create a custom policy and Custom policy examples.

Step 2: Attach the policy to a member

Attach an access policy to a RAM user or RAM role to grant the specified permissions. The following steps show how to grant permissions to a RAM user. The procedure for a RAM role is similar. For more information, see Manage RAM roles.

  1. Log on to the RAM console as a RAM administrator.

  2. In the left-side navigation pane, choose Identities > Users.

  3. On the Users page, find the required RAM user, and click Add Permissions in the Actions column. To grant the same permissions to multiple RAM users at once, select them and click Add Permissions at the bottom of the page.


  4. In the Add Permissions panel, configure the following parameters.

    Parameter Description
    Scope Select the application scope: Alibaba Cloud Account (permissions take effect within the current account) or Specific Resource Group (permissions take effect within the specified resource group).
    Principal The RAM user to authorize. The system pre-selects the current RAM user. Add other RAM users as needed.
    Access Policy Select a system policy or a custom policy that you have created.


  5. Click Confirm New Authorization.

  6. Click Close.

Step 3: Log on to the console after authorization

After authorization is complete, the RAM user or RAM role can log on to the Realtime Compute for Apache Flink console or refresh the current page.

Logon type Logon method Reference
Alibaba Cloud RAM user Log on as a RAM user Log in to the Alibaba Cloud Management Console as a RAM user
Alibaba Cloud RAM role A RAM user of account A assumes a role of account A Assuming a RAM role
Alibaba Cloud RAM role A RAM user of account B assumes a role of account A Accessing resources across Alibaba Cloud accounts
Resource directory member A RAM user of the management account assumes a RAM role of a member Log on to the Alibaba Cloud console by assuming a RAM role
Resource directory member Log on as a RAM user of a member Sign in to the Alibaba Cloud Management Console as a RAM user
Root user (not recommended) Log on as an Alibaba Cloud account Log in to the Alibaba Cloud Management Console as the root user
CloudSSO user Log on by assuming a RAM role Use CloudSSO to centrally manage identities and permissions across multiple enterprise accounts
CloudSSO user Log on as a RAM user

Custom policy examples

All examples below follow the same pattern: start with read-only permissions for Realtime Compute for Apache Flink, then add the specific actions required for the task.

Example 1: A RAM user activates a subscription workspace

To allow a RAM user to activate a subscription Realtime Compute for Apache Flink workspace that uses fully managed storage and free monitoring, include the following permissions in a custom policy:

  • Read-only permissions for Realtime Compute for Apache Flink: stream:Describe*, stream:Query*, stream:Check*, stream:List*, stream:Get*, stream:Search*, and stream:HasStreamDefaultRole

  • Permission to purchase a workspace: stream:CreateVvpInstance

  • Permission to query existing VPCs: vpc:DescribeVpcs

  • Permission to query existing vSwitches: vpc:DescribeVSwitches

  • Permissions to view and pay for orders in User Center: bss:DescribeOrderList, bss:DescribeOrderDetail, bss:PayOrder, and bss:CancelOrder

{
  "Version": "1",
  "Statement": [
    {
      "Action": [
        "stream:Describe*",
        "stream:Query*",
        "stream:Check*",
        "stream:List*",
        "stream:Get*",
        "stream:Search*",
        "stream:HasStreamDefaultRole",
        "stream:CreateVvpInstance",
        "vpc:DescribeVpcs",
        "vpc:DescribeVSwitches",
        "bss:DescribeOrderList",
        "bss:DescribeOrderDetail",
        "bss:PayOrder",
        "bss:CancelOrder"
      ],
      "Resource": "*",
      "Effect": "Allow"
    }
  ]
}

Example 2: A RAM user activates a subscription workspace (with AliyunStreamFullAccess already attached)

If a RAM user already has the AliyunStreamFullAccess system policy, add a custom policy with the following permissions to allow the user to activate a subscription workspace that uses fully managed storage and free monitoring:

  • Permission to query existing VPCs: vpc:DescribeVpcs

  • Permission to query existing vSwitches: vpc:DescribeVSwitches

  • Permissions to view and pay for orders in User Center: bss:DescribeOrderList, bss:DescribeOrderDetail, bss:PayOrder, and bss:CancelOrder

{
  "Version": "1",
  "Statement": [
    {
      "Action": [
        "vpc:DescribeVpcs",
        "vpc:DescribeVSwitches",
        "bss:DescribeOrderList",
        "bss:DescribeOrderDetail",
        "bss:PayOrder",
        "bss:CancelOrder"
      ],
      "Resource": "*",
      "Effect": "Allow"
    }
  ]
}

Example 3: A RAM user releases a subscription workspace

To allow a RAM user to release a subscription Flink workspace, include the following permissions:

  • Read-only permissions for Realtime Compute for Apache Flink: stream:Describe*, stream:Query*, stream:Check*, stream:List*, stream:Get*, stream:Search*, and stream:HasStreamDefaultRole

  • Permission to unsubscribe from orders in User Center: bss:Describe* and bss:Refund*

{
  "Version": "1",
  "Statement": [
    {
      "Action": [
        "stream:Describe*",
        "stream:Query*",
        "stream:Check*",
        "stream:List*",
        "stream:Get*",
        "stream:Search*",
        "stream:HasStreamDefaultRole",
        "bss:Describe*",
        "bss:Refund*"
      ],
      "Resource": "*",
      "Effect": "Allow"
    }
  ]
}

Example 4: A RAM user releases a pay-as-you-go workspace

To allow a RAM user to release a pay-as-you-go Flink workspace, include the following permissions:

  • Read-only permissions for Realtime Compute for Apache Flink: stream:Describe*, stream:Query*, stream:Check*, stream:List*, stream:Get*, stream:Search*, and stream:HasStreamDefaultRole

  • Permission to release a workspace: stream:DeleteVvpInstance

{
  "Version": "1",
  "Statement": [
    {
      "Action": [
        "stream:Describe*",
        "stream:Query*",
        "stream:Check*",
        "stream:List*",
        "stream:Get*",
        "stream:Search*",
        "stream:HasStreamDefaultRole",
        "stream:DeleteVvpInstance"
      ],
      "Resource": "*",
      "Effect": "Allow"
    }
  ]
}

Example 5: A RAM user allocates resources for a project

To allow a RAM user to allocate resources for a subscription Realtime Compute for Apache Flink project, include the following permissions:

  • Read-only permissions for Realtime Compute for Apache Flink: stream:Describe*, stream:Query*, stream:Check*, stream:List*, stream:Get*, stream:Search*, and stream:HasStreamDefaultRole

  • Permission to change the resources of a subscription project: stream:ModifyVvpNamespaceSpec

{
  "Version": "1",
  "Statement": [
    {
      "Action": [
        "stream:Describe*",
        "stream:Query*",
        "stream:Check*",
        "stream:List*",
        "stream:Get*",
        "stream:Search*",
        "stream:HasStreamDefaultRole",
        "stream:ModifyVvpNamespaceSpec"
      ],
      "Resource": "*",
      "Effect": "Allow"
    }
  ]
}

Custom policies

Realtime Compute for Apache Flink permission policies

Important

Before configuring permissions for a project, configure the DescribeVvpInstances permission first. Without it, a permission error occurs when users try to view workspaces.
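
For example, the following custom policy is a sketch of this pattern: the first statement grants DescribeVvpInstances on all workspaces so that users can view the workspace list, and the second statement grants project-level view permissions. Replace the placeholders as described earlier in this topic.

```json
{
  "Version": "1",
  "Statement": [
    {
      "Action": "stream:DescribeVvpInstances",
      "Resource": "acs:stream:{#regionId}:{#accountId}:vvpinstance/*",
      "Effect": "Allow"
    },
    {
      "Action": "stream:DescribeVvpNamespaces",
      "Resource": "acs:stream:{#regionId}:{#accountId}:vvpinstance/{#instanceId}/vvpnamespace/*",
      "Effect": "Allow"
    }
  ]
}
```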

Flink workspaces

{
  "Version": "1",
  "Statement": [
    {
      "Action": [
        "stream:CreateVvpInstance",
        "stream:DescribeVvpInstances",
        "stream:DeleteVvpInstance",
        "stream:RenewVvpInstance",
        "stream:ModifyVvpPrepayInstanceSpec",
        "stream:ModifyVvpInstanceSpec",
        "stream:ConvertVvpInstance",
        "stream:QueryCreateVvpInstance",
        "stream:QueryRenewVvpInstance",
        "stream:QueryModifyVvpPrepayInstanceSpec",
        "stream:QueryConvertVvpInstance"
      ],
      "Resource": "acs:stream:{#regionId}:{#accountId}:vvpinstance/{#instanceId}",
      "Effect": "Allow"
    }
  ]
}
Action Description
CreateVvpInstance Purchase a Realtime Compute for Apache Flink workspace.
DescribeVvpInstances View workspaces.
DeleteVvpInstance Release a Flink workspace.
RenewVvpInstance Renew a subscription workspace.
ModifyVvpPrepayInstanceSpec Scale a subscription workspace.
ModifyVvpInstanceSpec Adjust the quota of a pay-as-you-go workspace.
ConvertVvpInstance Change the billing method of a workspace.
QueryCreateVvpInstance Query the price for creating a workspace.
QueryRenewVvpInstance Query the price for renewing a workspace.
QueryModifyVvpPrepayInstanceSpec Query the price for scaling a workspace.
QueryConvertVvpInstance Query the price for changing the billing method from pay-as-you-go to subscription.
For purchasing and viewing workspaces, change "Resource": "acs:stream:{#regionId}:{#accountId}:vvpinstance/{#instanceId}" to "Resource": "acs:stream:{#regionId}:{#accountId}:vvpinstance/*".
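
For example, a minimal statement that allows a user to purchase and view all workspaces in the account could look like the following. This is a sketch that keeps only the purchase and view actions; the placeholders are as defined earlier in this topic.

```json
{
  "Version": "1",
  "Statement": [
    {
      "Action": [
        "stream:CreateVvpInstance",
        "stream:DescribeVvpInstances"
      ],
      "Resource": "acs:stream:{#regionId}:{#accountId}:vvpinstance/*",
      "Effect": "Allow"
    }
  ]
}
```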

Flink projects

{
  "Version": "1",
  "Statement": [
    {
      "Action": [
        "stream:CreateVvpNamespace",
        "stream:DeleteVvpNamespace",
        "stream:ModifyVvpPrepayNamespaceSpec",
        "stream:ModifyVvpNamespaceSpec",
        "stream:DescribeVvpNamespaces"
      ],
      "Resource": "acs:stream:{#regionId}:{#accountId}:vvpinstance/{#instanceId}/vvpnamespace/{#namespace}",
      "Effect": "Allow"
    }
  ]
}
Action Description
CreateVvpNamespace Create a project.
DeleteVvpNamespace Delete a project.
ModifyVvpPrepayNamespaceSpec Change the resources of a subscription project.
ModifyVvpNamespaceSpec Change the resources of a pay-as-you-go project.
DescribeVvpNamespaces View a list of projects. After you configure this policy, click the expand icon to the left of a destination workspace ID to view the list of projects in the workspace. To access the development console of a destination project, you must also have permissions to develop jobs in that project. For more information, see Development console authorization.
For creating and viewing projects, change "Resource": "acs:stream:{#regionId}:{#accountId}:vvpinstance/{#instanceId}/vvpnamespace/{#namespace}" to "Resource": "acs:stream:{#regionId}:{#accountId}:vvpinstance/{#instanceId}/vvpnamespace/*".
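
For example, a minimal statement that allows a user to create and view all projects in a workspace could look like the following. This is a sketch that keeps only the create and view actions; the placeholders are as defined earlier in this topic.

```json
{
  "Version": "1",
  "Statement": [
    {
      "Action": [
        "stream:CreateVvpNamespace",
        "stream:DescribeVvpNamespaces"
      ],
      "Resource": "acs:stream:{#regionId}:{#accountId}:vvpinstance/{#instanceId}/vvpnamespace/*",
      "Effect": "Allow"
    }
  ]
}
```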

Permission operations for related products

ECS

To access the development console from the internet, you must activate an Elastic IP Address (EIP). To connect to resources in a Virtual Private Cloud (VPC), Flink creates elastic network interfaces (ENIs) in your VPC and adds them to a dedicated security group for serverless Flink. Flink requires the following ECS permissions to manage these resources.

Action Description
ecs:AssociateEipAddress Request an EIP address to access the Flink service over the public network.
ecs:AttachNetworkInterface Attach your ENI to a Flink resource pool.
ecs:AuthorizeSecurityGroup Add an inbound rule to the security group that Flink creates.
ecs:AuthorizeSecurityGroupEgress Add an outbound rule to the security group that Flink creates.
ecs:CreateNetworkInterface Create an ENI in your VPC to support the connection from Flink to your VPC.
ecs:CreateNetworkInterfacePermission Grant permissions on ENIs to the Flink service.
ecs:CreateSecurityGroup Create the security group that Flink uses.
ecs:DeleteNetworkInterface Delete ENIs of related resources after a Flink task completes.
ecs:DeleteNetworkInterfacePermission Allow the Flink service to detach your ENI.
ecs:DeleteSecurityGroup Delete the security group that Flink created.
ecs:DescribeNetworkInterfacePermissions Query ENI permissions to determine whether your ENI can be detached from a serverless Flink resource pool.
ecs:DescribeNetworkInterfaces Query ENIs.
ecs:DescribeSecurityGroupAttribute Query the rules of a security group.
ecs:DescribeSecurityGroupReferences Query security groups and security group-level authorization behaviors.
ecs:DescribeSecurityGroups Query the basic information about created security groups.
ecs:DetachNetworkInterface Detach your ENI from a Flink resource pool.
ecs:JoinSecurityGroup Add an ENI to a specified security group.
ecs:LeaveSecurityGroup Remove an ENI from a specified security group.
ecs:ModifyNetworkInterfaceAttribute Modify the name, description, and security group of an ENI.
ecs:ModifySecurityGroupAttribute Modify the name or description of a security group.
ecs:ModifySecurityGroupPolicy Modify the connectivity policy in a security group.
ecs:ModifySecurityGroupRule Modify the description of an inbound security group rule.
ecs:RevokeSecurityGroup Delete an inbound security group rule.
ecs:RevokeSecurityGroupEgress Delete an outbound security group rule.
ecs:UnassociateEipAddress Release an EIP.

OSS

To view the list of OSS buckets, grant the following OSS permissions.

Action Description
oss:ListBuckets View the list of OSS buckets.
oss:GetBucketInfo Get information about a bucket.
oss:GetObjectMetadata Get the metadata of a file.
oss:GetObject Get a file.
oss:ListObjects List all objects in a bucket.
oss:PutObject Upload a file.
oss:CopyObject Copy files (objects) within and between buckets in the same region.
oss:CompleteMultipartUpload Complete the multipart upload of a file after all parts are uploaded.
oss:AbortMultipartUpload Cancel a multipart upload event and delete the corresponding part data.
oss:InitiateMultipartUpload Initialize a multipart upload event before data transmission.
oss:UploadPartCopy Copy data from an existing object to upload a part.
oss:UploadPart Upload data by part based on the specified object name and upload ID.
oss:DeleteObject Delete a file (object).
oss:PutBucketCors Set cross-origin resource sharing (CORS) rules for a specified bucket.
oss:GetBucketCors Get the current CORS rules of a specified bucket.
oss:PutBucket Create a bucket.
If you use the Key Management Service (KMS) encryption feature of OSS, add KMS-related access policies to the AliyunStreamAsiDefaultRole role. For more information, see Upload a file to a bucket for which default encryption is configured.

Application Real-Time Monitoring Service (ARMS)

Flink metrics are stored in Application Real-Time Monitoring Service (ARMS), which is activated automatically when you use Flink.

Action Description
arms:ListDashboards View ARMS dashboard information.
arms:CreateContact Create a contact.
arms:DeleteContact Delete a contact.
arms:SearchContact Search for a contact.
arms:UpdateContact Update a contact.
arms:CreateContactGroup Create a contact group.
arms:DeleteContactGroup Delete a contact group.
arms:SearchContactGroup Search for a contact group.
arms:UpdateContactGroup Update a contact group.
arms:SearchAlertRules Search for alert rules.
arms:CreateAlertRules Create alert rules.
arms:UpdateAlertRules Update alert rules.
arms:DeleteAlertRules Delete alert rules.
arms:StartAlertRule Start an alert rule.
arms:StopAlertRule Pause an alert rule.
arms:SearchAlarmHistories View historical alert information.
arms:OpenArmsService Activate the ARMS service.
arms:CreateWebhook Create a webhook.
arms:UpdateWebhook Update a webhook.
arms:CreateDispatchRule Create a scheduling rule.
arms:ListDispatchRule View the list of dispatch rules.
arms:DeleteDispatchRule Delete a scheduling rule.
arms:UpdateDispatchRule Update a scheduling rule.
arms:DescribeDispatchRule View the details of a scheduling rule.
arms:GetAlarmHistories Get alert sending history.
arms:SaveAlert Save an alert rule.
arms:DeleteAlert Delete an alert rule.
arms:GetAlert Get an alert rule.
arms:CheckServiceStatus Check the service activation status.
arms:InstallManagedPrometheus Create a managed Prometheus instance.
arms:UninstallManagedPrometheus Uninstall a managed Prometheus instance.
arms:GetManagedPrometheusStatus Get the installation status of a managed Prometheus instance.

VPC

The following VPC permissions are required during workspace activation to query and create VPC resources.

Action Description
vpc:DescribeVpcAttribute Query the configuration of a specified VPC.
vpc:DescribeVpcs Query created VPCs.
vpc:DescribeVSwitchAttributes Query information about a specified vSwitch.
vpc:DescribeVSwitches Query created vSwitches.
vpc:DescribeRouteTableList Query a list of route tables.
vpc:DescribeRouteTables Query a specified route table.
vpc:DescribeRouteEntryList Query a list of route entries.
vpc:DescribeRouterInterfaceAttribute Query router interface configurations.
vpc:DescribeRouterInterfaces Query router interfaces.
vpc:DescribeVRouters Query a list of vRouters in a specified region.
vpc:CreateVpc Create a VPC.
vpc:CreateVSwitch Create a vSwitch.

RAM

The following RAM permission is required during workspace activation for resource configuration.

Action Description
ram:* Add, delete, modify, and query domain and application RAM resources.

TAG

Action Description
tag:ListTagResources Query a list of resource tags.
tag:ListTagKeys Query a list of tag keys.
tag:ListTagValues Query the tag values corresponding to a specified tag key.

Data Lake Formation (DLF)

The following Data Lake Formation (DLF) permissions are required to access DLF-related catalogs during workspace activation.

Action Description
dlf:BatchCreatePartitions Create multiple partitions at a time.
dlf:BatchCreateTables Create multiple tables at a time.
dlf:BatchDeletePartitions Delete multiple partitions at a time.
dlf:BatchDeleteTables Delete multiple tables at a time.
dlf:BatchGetPartitions Get multiple partitions at a time.
dlf:BatchGetTables Get multiple tables at a time.
dlf:BatchUpdatePartitions Update multiple partitions at a time.
dlf:BatchUpdateTables Update multiple tables at a time.
dlf:CreateCatalog Create a data lake catalog.
dlf:CreateDatabase Create a database.
dlf:CreateFunction Create a function.
dlf:CreatePartition Create a partition.
dlf:CreateTable Create a table.
dlf:DeleteCatalog Delete a data lake catalog.
dlf:DeleteDatabase Delete a database.
dlf:DeleteFunction Delete a function.
dlf:DeletePartition Delete a partition.
dlf:DeleteTable Delete a table.
dlf:GetAsyncTaskStatus Get the status of an asynchronous task.
dlf:GetCatalog Get a data lake catalog.
dlf:GetCatalogByInstanceId Get a catalog by instance ID.
dlf:GetCatalogSettings Get the configurations of a data lake.
dlf:GetDatabase Get a database.
dlf:GetFunction Get a function.
dlf:GetPartition Get a partition.
dlf:GetTable Get a table.
dlf:ListCatalogs Get a list of catalogs.
dlf:ListDatabases Get a list of databases.
dlf:ListFunctionNames Get a list of function names.
dlf:ListFunctions Get a list of functions.
dlf:ListPartitionNames Get a list of partition names.
dlf:ListPartitions Get a list of partitions.
dlf:ListPartitionsByExpr Get a list of partitions by expression.
dlf:ListPartitionsByFilter Get a list of partitions by filter.
dlf:ListTableNames Get a list of table names.
dlf:ListTables Get a list of tables.
dlf:RenamePartition Rename a partition.
dlf:RenameTable Rename a table.
dlf:UpdateCatalog Update a data lake catalog.
dlf:UpdateDatabase Update a database.
dlf:UpdateFunction Update a function.
dlf:UpdateTable Update a table.
dlf:BatchGetPartitionColumnStatistics Get the statistics of metadata partitions at a time.
dlf:DeletePartitionColumnStatistics Delete the statistics of a metadata table partition.
dlf:DeleteTableColumnStatistics Delete the statistics of a metadata table.
dlf:GetPartitionColumnStatistics Get the statistics of a metadata partition field.
dlf:GetTableColumnStatistics Get the statistics of a metadata table field.
dlf:UpdateTableColumnStatistics Update the statistics of a metadata table.
dlf:UpdatePartitionColumnStatistics Update the statistics of a metadata table partition.
dlf:CreateLock Create a metadata lock.
dlf:UnLock Unlock a specified metadata lock.
dlf:AbortLock Abort a metadata lock.
dlf:RefreshLock Refresh a metadata lock.
dlf:GetLock Query a metadata lock.
dlf:GetCatalogAccessInfo Use a CatalogUuid to get backend storage information such as StorageName and StorageEndpoint.
dlf:GetDataToken Use a UUID to get a catalog-level or table-level data key.
dlf:GetDataTokenByName Use a CatalogUuid, DatabaseName, and TableName to get a catalog-level or table-level data key.
dlf-auth:ActOnBehalfOfAnotherUser Pass through an identity. A service-linked role (SLR) or service role (SR) accesses DLF on behalf of another user.
dlf:GrantPermissions Grant a principal permissions on resources.
dlf:RevokePermissions Revoke permissions on resources from a principal.
dlf:BatchGrantPermissions Grant permissions in a batch.
dlf:BatchRevokePermissions Revoke permissions in a batch.
dlf:UpdatePermissions Update the permissions of a principal on resources.
dlf:ListPermissions Get the permission information of a specified resource or principal.
dlf:CreateRole Create a role.
dlf:UpdateRole Update a role.
dlf:DeleteRole Delete a role.
dlf:GetRole Get a role.
dlf:ListRoles Query a list of roles.
dlf:GrantRolesToUser Grant multiple role permissions to a specified user at a time.
dlf:RevokeRolesFromUser Revoke multiple role permissions from a specified user at a time.
dlf:GrantRoleToUsers Grant a specified role permission to multiple users at a time.
dlf:RevokeRoleFromUsers Revoke a specified role permission from multiple users at a time.
dlf:UpdateRoleUsers Update the users in a role.
dlf:ListRoleUsers Query a list of users in a role.
dlf:ListUserRoles Query a list of user roles.
dlf:GrantRolesToPrincipal Grant multiple role permissions to a specified principal at a time.
dlf:RevokeRolesFromPrincipal Revoke multiple role permissions from a specified principal at a time.
dlf:GrantRoleToPrincipals Grant a specified role permission to multiple principals at a time.
dlf:RevokeRoleFromPrincipals Revoke a specified role permission from multiple principals at a time.
dlf:UpdateRolePrincipals Update the principals in a role.
dlf:BatchDeleteRoles Delete multiple roles at a time.
dlf:CheckPermissions Check permissions.
dlf:GetCatalogStorageStatistics Get catalog storage statistics.
dlf:GetCatalogStorageIndicatorDetails Get catalog metric trends.
dlf:GetCatalogStorageRank Get catalog storage statistics rankings.
dlf:GetCatalogStorageAnalysis Get catalog storage distribution data.
dlf:GetDatabaseProfile Get database data profiles.
dlf:GetDatabaseStorageAnalysis Get database storage distribution data.
dlf:GetTableProfile Get table data profiles.
dlf:GetTableStorageAnalysis Get table storage distribution data.
dlf:ListPartitionProfiles Get a list of partition data profiles.
dlf:getLatestStorageStatisticsDate Get the last update time of storage overview data.
dlf:SubscribeOptimize Submit for optimization.
dlf:GetOptimizeRegionStatus Get the optimization region status.
dlf:GetOptimizeWorkspaceAuthorization Get the authentication of an optimization workspace.
dlf:AddOptimizeWorkspace Add an optimization workspace.
dlf:ListOptimizeWorkspaces Get a list of optimization workspaces.
dlf:PreCheckOptimizeWorkspaceConnection Pre-check an optimization workspace connection.
dlf:CheckOptimizeWorkspaceConnection Check an optimization workspace connection.
dlf:DeleteOptimizeWorkspace Delete an optimization workspace.
dlf:SetOptimizeEnable Set the storage optimization switch.
dlf:SetOptimizePolicy Set a storage optimization policy.
dlf:GetOptimizePolicy Get a storage optimization policy.
dlf:SetOptimizeScheduleRule Add a storage optimization scheduling rule.
dlf:ListOptimizeScheduleRules Get a list of optimization schedules.
dlf:DeleteOptimizeScheduleRule Delete a storage optimization scheduling rule.
dlf:RunOptimizeImmediately Run storage management optimization immediately.
dlf:GetOptimizeInfo Get optimization information.
dlf:UpdateOptimizeTaskResult Update a storage optimization task result.
dlf:BatchDeleteTableVersions Delete specified versions of a Data Lake table at a time.
dlf:DeleteTableVersion Delete a specified version of a Data Lake table.
dlf:GetTableVersion Get a specified version of a Data Lake table.
dlf:ListTableVersions Query a list of specified versions of a Data Lake table by page.
dlf:Search Retrieve metadata.
dlf:SearchAcrossCatalog Search for content such as databases, tables, and fields across catalogs.
dlf:GetServiceStatus Get the service activation status of a user for Data Lake Formation.
dlf:GetRegionStatus Get the service activation status of Data Lake Formation in a specified region.

What's next