
Data Management: Query and analyze operation logs of DMS in Simple Log Service

Last Updated: Dec 02, 2025

This topic describes how to query and analyze Data Management (DMS) operation logs in Simple Log Service (SLS).

Background information

Operation logs are sequential records of all operations that users perform in DMS. These logs contain information such as user details, the names of the feature modules used, the time of operations, the types of operations, and the SQL statements that are run. For more information, see Features.

Prerequisites

Billing method

You can export operation logs of DMS to Simple Log Service free of charge. However, Simple Log Service charges you for storage. For more information about the billing of Simple Log Service, see Billable items of pay-by-feature.

Resources

  • Custom Simple Log Service projects and Logstores. You must create a project and a Logstore in SLS before you begin; for a scripted way to create them, see the sketch after this list.

    Important
    • Do not delete the Simple Log Service project and Logstore that are related to DMS operation logs. Otherwise, logs cannot be collected to Simple Log Service.

    • When you create a custom Logstore, different billing items are generated based on the billing method.

  • Exclusive dashboard

    No exclusive dashboard is available.
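
For reference, the following is a minimal sketch of creating the custom project and Logstore with the aliyun-log Python SDK instead of the SLS console. The endpoint, names, TTL, and shard count are placeholder assumptions; adjust them to your environment, and note that the retention period (ttl) directly affects the storage fees mentioned above.

    # Minimal sketch: create an SLS project and Logstore for DMS operation logs.
    # Assumes: pip install aliyun-log-python-sdk; placeholder endpoint and AccessKey pair.
    from aliyun.log import LogClient

    client = LogClient(
        endpoint="cn-hangzhou.log.aliyuncs.com",   # placeholder: use your project's region endpoint
        accessKeyId="<your AccessKey ID>",
        accessKey="<your AccessKey secret>",
    )

    # Create the project that will hold the exported DMS operation logs.
    client.create_project("dms-oplog-project", "DMS operation logs")

    # Create the Logstore. ttl is the retention period in days; longer retention
    # means higher storage fees. shard_count controls read/write throughput.
    client.create_logstore("dms-oplog-project", "dmstest", ttl=30, shard_count=2)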

Procedure

Step 1: Register the project with DMS

  1. Log on to the DMS console V5.0 as an administrator.

  2. On the console home page, in the Database Instances area, click the add instance icon.

    Note

    If you use the DMS console in Simple mode, click Database Instances in the left-side navigation pane. In the Database Instances section, click the add instance icon.

  3. On the Add Instance page, enter the SLS information.

    Data Source

    • Select Alibaba Cloud.

    Basic Information

    • File and Log Storage: Select SLS.

    • Instance Region: Select the region where the SLS project is located.

    • Connection Method: The default value is Connection String Address.

    • Connection String Address: A connection string is automatically generated after you select an Instance Region.

    • Project Name: Enter the name of the project that you created in SLS.

    • AccessKey ID: Enter the AccessKey ID of your Alibaba Cloud account. The AccessKey ID is used for identity verification when you register the project with DMS. For more information about how to obtain an AccessKey ID, see Create an AccessKey pair.

    • AccessKey Secret: Enter the AccessKey secret that corresponds to the AccessKey ID. For more information about how to obtain an AccessKey secret, see Create an AccessKey pair.

    • Advanced Feature Pack: This data source does not support feature packs. By default, the instance uses Flexible Management mode.

    Advanced Information

    • Environment Type: Select an environment type. Valid values are Dev, Test, Production, Pre-release, SIT, UAT, Stress Testing, and STAG. For more information, see Instance environment types.

    • Instance Name: Specify a custom name for the SLS project to be displayed in DMS. You can change the instance name later by editing the instance. For more information, see Edit instance information.

    • DBA: Select a database administrator (DBA) for the instance. The DBA is involved in subsequent processes such as permission applications.

    • Query Timeout (s): A security policy that limits the execution time of query statements to protect the database.

    • Export Timeout (s): A security policy that limits the execution time of export statements to protect the database.

    Note
    • After you configure the basic information, click Test Connection at the bottom of the page and wait for the test to pass.

    • If the error message "The execution result of the 'getProject' command is null" appears, confirm that the project was created by the Alibaba Cloud account that you use to log on to DMS. A sketch for checking this with the SLS SDK follows this procedure.

  4. Click Submit.
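
If the "getProject" error from the note above appears, you can check whether the project is visible to your logon account by calling the SLS API directly. The following is a minimal sketch using the aliyun-log Python SDK; the endpoint, credentials, and project name are placeholders.

    # Minimal sketch: verify that the SLS project exists under the current account.
    from aliyun.log import LogClient, LogException

    client = LogClient(
        endpoint="cn-hangzhou.log.aliyuncs.com",   # placeholder
        accessKeyId="<your AccessKey ID>",
        accessKey="<your AccessKey secret>",
    )

    try:
        client.get_project("<your project>").log_print()
        print("Project is visible to this account.")
    except LogException as e:
        # An error here usually means the project belongs to another account.
        print("Project lookup failed:", e)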

Step 2: Create a task in DMS to export operation logs

  1. Log on to the DMS console V5.0.
  2. Move the pointer over the menu icon in the upper-left corner and choose All Features > Security and Disaster Recovery (former DBS) > Operation Audit.

    Note

    If you use the DMS console in Normal mode, choose Security and Disaster Recovery (former DBS) > Operation Audit in the top navigation bar.

  3. Click the Export Logs tab. Then, in the upper-right corner, click New Task.

  4. In the New Export Task dialog box, configure the following parameters.

    • Task Name (required): The name of the export task. Specify a descriptive name for easy identification.

    • Destination Log Service (required): The Simple Log Service project, which is a resource management unit.

    • SLS Logstore (required): The Logstore that stores the exported DMS operation logs. Click the input box for this parameter and select the destination Logstore.

      Note

      If the destination Logstore is not in the drop-down list, click Sync Dictionary and then click OK. DMS automatically collects the metadata of the Logstore.

    • Feature Module (required): The DMS feature modules whose logs you want to export. These modules correspond to the modules on the Operation Logs tab and include features such as instance management, user management, permissions, and data query in the SQL Window. By default, logs of All Features are exported.

    • Scheduling Method (required): The scheduling method of the task.

      • One-time Tasks: After the export task is created, it runs only once.

      • Periodic Tasks: Select Day, Week, or Month to export logs to the Logstore on a recurring schedule. The first run exports all DMS operation logs generated from the log start time to the first scheduled start time. Subsequent runs export only incremental logs. For more information, see Periodic scheduling.

    • Log Time Range (optional): The time range of the logs to export. If you do not configure this parameter, logs from the last three years are exported by default.

      Note

      This parameter is available only when you set Scheduling Method to One-time Tasks.

    • Log Start Time (optional): The start time of the DMS log records. If you do not configure this parameter, the default value is three years before the task creation time.

      Note
      • This parameter is available only when you set Scheduling Method to Periodic Tasks.

      • Periodic tasks do not have an end time.

  5. Click OK. A log export task is created. The system also creates index fields, such as dbId, dbName, and dbUser, in your Logstore for subsequent data queries and analysis.

    • A one-time task exports logs only once. When the task status is Successful, the logs are exported.

      Note

      Because the Logstore index takes effect after a delay, a one-time task starts approximately 90 seconds after it is created.

    • A periodic task exports logs multiple times. The task status is Pending Scheduling both before and after the export. You can view the task logs to determine whether a task run is successful.

    You can also perform the following operations in the Actions column for the task.

    • Query: Click Query to go to the SQL Console page, and then click Query again to run the statement. In the execution result section at the bottom of the page, you can view the logs that are exported to the Logstore.

    • Task Logs: Click Task Logs to view information such as the task start and end times, the number of delivered logs, and the task status.

    • Pause: Click Pause. In the dialog box that appears, click OK. The periodic task is paused.

    • Restart: Click Restart. In the dialog box that appears, click OK to restart a paused periodic task.

      Note
      • The restart operation is not supported for one-time tasks. However, other operations are supported.

      • All operations, such as query and pause, are supported for periodic tasks.

For more information about how to create an export task, see Export DMS operation logs to Simple Log Service.
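
Step 5 above mentions that DMS creates index fields such as dbId, dbName, and dbUser in your Logstore. If you script against SLS, you can confirm that the index is in place before querying. The following is a minimal sketch using the aliyun-log Python SDK; the endpoint, credentials, and names are placeholders.

    # Minimal sketch: fetch the index configuration of the destination Logstore.
    from aliyun.log import LogClient

    client = LogClient(
        endpoint="cn-hangzhou.log.aliyuncs.com",   # placeholder
        accessKeyId="<your AccessKey ID>",
        accessKey="<your AccessKey secret>",
    )

    # Raises an error if the Logstore has no index yet; otherwise prints the
    # index configuration, including field indexes such as dbId and dbName.
    client.get_index_config("<your project>", "dmstest").log_print()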

Step 3: Query and analyze the exported DMS operation logs in the SLS console

  1. Log on to the Simple Log Service console.

  2. In the Projects section, click the project that you want to manage.

  3. On the Log Storage > Logstores tab, click the Logstore that you want to query.

  4. In the search box, enter a query and analysis statement.

    A query and analysis statement consists of a search statement and an analytic statement in the format of Search statement|Analytic statement. For more information about the syntax, see Search syntax and functions and SQL analytic functions.

    You can query and analyze the following information in SLS:

    Note

    The dmstest Logstore is used as an example.

    • Users who failed to log on to a database most frequently.

      __topic__ : DMS_LOG_DELIVERY AND subModule : LOGIN | SELECT concat(cast(operUserId as varchar), '(', operUserName, ')') user, COUNT(*) cnt FROM dmstest WHERE state = '0' GROUP BY operUserId, operUserName ORDER BY cnt DESC LIMIT 10;

    • Users with an abnormal source IP address. The IP address 127.0.0.1 is used as an example.

      Note

      The source IP address of an instance is your local IP address when you register the instance with DMS. This address identifies the source of the instance access.

      __topic__ : DMS_LOG_DELIVERY | SELECT concat(cast(operUserId as varchar), '(', operUserName, ')') user, COUNT(*) cnt FROM dmstest WHERE state = '0' and requestIp in ('127.0.0.1') GROUP BY operUserId, operUserName ORDER BY cnt DESC LIMIT 10;

    • The user who accessed DMS most frequently.

      __topic__ : DMS_LOG_DELIVERY | SELECT concat(cast(operUserId as varchar), '(', operUserName, ')') user, COUNT(*) cnt FROM dmstest GROUP BY operUserId, operUserName ORDER BY cnt DESC LIMIT 10;

    • Users who accessed and operated multiple databases on the same day.

      __topic__ : DMS_LOG_DELIVERY | SELECT concat(cast(operUserId as varchar), '(', operUserName, ')') user, date_trunc('day', gmtCreate) time, dbId, COUNT(*) qpd FROM dmstest GROUP BY time, operUserId, operUserName, dbId ORDER BY time, qpd DESC;

    • Users who failed to perform database operations in DMS.

      __topic__ : DMS_LOG_DELIVERY AND moudleName : SQL_CONSOLE | SELECT concat(cast(operUserId as varchar), '(', operUserName, ')') user, actionDesc as sqlStatement, subModule as sqlType, remark as failReason FROM dmstest WHERE state = '-1' order by id;

    • Users who downloaded sensitive data most frequently.

      __topic__ : DMS_LOG_DELIVERY AND moudleName : DATA_EXPORT | SELECT concat(cast(operUserId as varchar), '(', operUserName, ')') user, COUNT(*) cnt FROM dmstest WHERE hasSensitiveData = 'true' GROUP BY operUserId, operUserName ORDER BY cnt DESC LIMIT 10;

    • SQL statements that are run for batch operations, such as deleting and updating sensitive data.

      __topic__ : DMS_LOG_DELIVERY | SELECT subModule, COUNT(*) cnt, COUNT(affectRows) affectRow FROM dmstest WHERE subModule != '' GROUP BY subModule ORDER BY cnt DESC;

    • Whether the data watermark feature is enabled during data export.

      __topic__ : DMS_LOG_DELIVERY AND moudleName : DATA_EXPORT | SELECT targetId as orderId, concat(cast(operUserId as varchar), '(', operUserName, ')') user, COUNT(*) cnt FROM dmstest where actionDesc like '%Enable data watermark: false' GROUP BY targetId, operUserId, operUserName ORDER BY cnt DESC LIMIT 10;

      Note
      • To query users who enabled the data watermark feature, change the pattern in the LIKE clause to '%Enable data watermark: true'.

      • To query users who disabled the data watermark feature, use the pattern '%Enable data watermark: false', as in the example above.

    • Users who downloaded SQL result sets from the execution result section of the SQL Console page.

      __topic__ : DMS_LOG_DELIVERY AND moudleName : SQL_CONSOLE_EXPORT | SELECT concat(cast(operUserId as varchar), '(', operUserName, ')') user, COUNT(*) cnt FROM dmstest GROUP BY operUserId, operUserName ORDER BY cnt DESC LIMIT 10;

For more information about query and analysis, see Step 1: Configure indexes.
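
You can also run the same query and analysis statements programmatically. The following is a minimal sketch that executes the first example above (most frequent failed logons) with the aliyun-log Python SDK; the endpoint, credentials, project name, and time range are placeholder assumptions.

    # Minimal sketch: run a search|analytics statement against the dmstest Logstore.
    import time

    from aliyun.log import GetLogsRequest, LogClient

    client = LogClient(
        endpoint="cn-hangzhou.log.aliyuncs.com",   # placeholder
        accessKeyId="<your AccessKey ID>",
        accessKey="<your AccessKey secret>",
    )

    # The same statement as the first example in this step.
    query = (
        "__topic__ : DMS_LOG_DELIVERY AND subModule : LOGIN | "
        "SELECT concat(cast(operUserId as varchar), '(', operUserName, ')') user, "
        "COUNT(*) cnt FROM dmstest WHERE state = '0' "
        "GROUP BY operUserId, operUserName ORDER BY cnt DESC LIMIT 10"
    )

    now = int(time.time())
    request = GetLogsRequest(
        project="<your project>",
        logstore="dmstest",
        fromTime=now - 7 * 24 * 3600,   # placeholder: query the last 7 days
        toTime=now,
        query=query,
    )
    client.get_logs(request).log_print()   # prints the (user, cnt) rows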

(Optional) Step 4: Pause the periodic task

Note

If you created a periodic task in Step 2 (with Scheduling Method set to Periodic Tasks) and have finished analyzing the DMS operation logs, you can pause the task.

  1. Log on to the DMS console V5.0.
  2. Move the pointer over the menu icon in the upper-left corner and choose All Features > Security and Disaster Recovery (former DBS) > Operation Audit.

    Note

    If you use the DMS console in Normal mode, choose Security and Disaster Recovery (former DBS) > Operation Audit in the top navigation bar.

  3. Click the Export Logs tab.

  4. In the Actions column for the periodic task, click Pause.

  5. In the dialog box that appears, click OK.

What to do next

After the log export task runs successfully or is paused, the project is not automatically deleted. If you have finished analyzing the DMS operation logs, go to the Simple Log Service console and delete the project that you selected for the log export to avoid unnecessary charges.
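
If you manage SLS resources from scripts, the cleanup can also be done through the SDK. The following is a minimal sketch; the project name and credentials are placeholders, and deleting a project permanently removes all Logstores in it, so verify the name before running it.

    # Minimal sketch: delete the SLS project that held the exported DMS logs.
    from aliyun.log import LogClient

    client = LogClient(
        endpoint="cn-hangzhou.log.aliyuncs.com",   # placeholder
        accessKeyId="<your AccessKey ID>",
        accessKey="<your AccessKey secret>",
    )

    # WARNING: irreversibly deletes the project and every Logstore in it.
    client.delete_project("dms-oplog-project")   # placeholder project name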

Description of raw log fields in SLS

The following list describes some important fields in a DMS operation log that is imported to SLS.

  • id: The unique ID of the log.

  • gmt_create: The time when the log was created.

  • gmt_modified: The time when the log was modified.

  • oper_user_id: The user ID of the operator.

  • oper_user_name: The name of the operator.

  • module_name: The exported feature module. Valid values:

    • PERMISSION: permissions

    • OWNER: data owner

    • SQL_CONSOLE: SQL Window

    • DSQL_CONSOLE: cross-database data query

    • LOGICDW_CONSOLE: logical data warehouse

    • SQL_CONSOLE_EXPORT: SQL Window result export

    • DSQL_CONSOLE_EXPORT: cross-database query result export

    • DATA_CHANGE: data change

    • DATA_EXPORT: data export

    • DATA_TRACK: data tracking

    • SQL_REVIEW: SQL review

    • DT_SYNC: database and table synchronization

    • DT_DETAIL: database and table details

    • DB_TASK: task management

    • INSTANCE_MANAGE: instance management

    • USER_MANAGE: user management

    • SECURITY_RULE: security rule

    • CONFIG_MANAGE: configuration management

    • ACCESS_WHITE_IP: IP address whitelist for access

    • NDDL: schema design

    • DATASEC_MANAGE: sensitive data management

    • DATABASE_CLONE: database cloning

    • DATABASE_MANAGE: database management

    • MESSAGE_CENTER: notification management

    • VERSION_MANAGE: version management

    • OPER_AUDIT: operation audit

    • SENSITIVE_DATA_PROTECT: sensitive data protection

    • SENSITIVE_DATA_USAGE: sensitive data usage audit

    • DB_EXPORT: database export

    • WATERMARK_EXTRACT: watermark extraction

    • GENERATE: test data generation

    • CLASSIFICATION_TEMPLATE: classification template

    • SECURITY_CENTER: Data Security Center

  • sub_module: The sub-feature module. For example, for SQL_CONSOLE, the sub-module is the type of the SQL statement that a user runs.

  • db_id: The ID of the database on which the operation is performed. This is the ID in DMS.

  • db_name: The name of the database on which the operation is performed.

  • is_logic_db: Indicates whether the database is a logical database.

  • instance_id: The ID of the instance on which the operation is performed. This is the ID in DMS.

  • instance_name: The name of the instance on which the operation is performed.

  • action_desc: The description of the operation.

  • remark: The remarks.

  • has_sensitive_data: Indicates whether the log contains sensitive information.
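
As an illustration of how these fields can be consumed downstream, the following sketch counts sensitive-data exports per operator from log records that have already been fetched as Python dicts (for example, with the SDK calls shown earlier). The sample records are hypothetical and use the raw field names described above.

    # Minimal sketch: flag operators who exported sensitive data, using raw log fields.
    from collections import Counter

    # Hypothetical records shaped like the raw log fields documented above.
    records = [
        {"oper_user_name": "alice", "module_name": "DATA_EXPORT", "has_sensitive_data": "true"},
        {"oper_user_name": "bob", "module_name": "SQL_CONSOLE", "has_sensitive_data": "false"},
        {"oper_user_name": "alice", "module_name": "DATA_EXPORT", "has_sensitive_data": "true"},
    ]

    # Count sensitive-data exports per operator, mirroring the DATA_EXPORT query in Step 3.
    sensitive_exports = Counter(
        r["oper_user_name"]
        for r in records
        if r["module_name"] == "DATA_EXPORT" and r["has_sensitive_data"] == "true"
    )

    for user, cnt in sensitive_exports.most_common():
        print(user, cnt)   # e.g., "alice 2"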