Data Management (DMS) supports exporting operation logs to Alibaba Cloud Simple Log Service (SLS) for data transformation and analysis.
Prerequisites
- You have activated Simple Log Service. For more information, see Activate Simple Log Service.
- You have created a Simple Log Service project and a Logstore. For more information, see Manage projects and Create a basic Logstore.
- You have added the project to DMS on the Instance Management page. For more information about how to add an instance, see Add an ApsaraDB instance.
- The destination Logstore is empty and has a full-text index or a field index enabled. For more information about full-text and field indexes, see Create an index.
Background information
Operation logs record all operations that users perform in DMS. For more information, see Features.
Billing
The feature that exports DMS operation logs to SLS is free of charge.
- If the Logstore uses the pay-by-feature billing method, you are charged for items such as storage space, read traffic, requests, data transformation, and data shipping after the logs are collected by Simple Log Service. For more information, see the billing documentation.
- If the Logstore uses the pay-by-ingested-data billing method, you are charged for the amount of raw data ingested after the logs are collected by Simple Log Service. For more information, see the billing documentation.
Procedure
- Log on to the DMS console V5.0.
- Move the pointer over the icon in the upper-left corner and choose the menu item that opens the Operation Logs page. Note: If you use the DMS console in normal mode, choose the corresponding item in the top navigation bar.
- Click the Export Logs tab. In the upper-right corner, click Create Task.
- In the Create Export Task dialog box, configure the parameters described in the following table.
Parameter | Required | Description |
Task Name | Yes | The name of the export task. A descriptive name helps you find the task later. |
Destination SLS | Yes | The Simple Log Service project to which you want to export logs. A project is a resource management unit in Simple Log Service. |
SLS Logstore | Yes | The Logstore to which you want to export logs. This Logstore must have an index and contain no log entries. Note: You can click Sync Dictionary and then click Confirm so that DMS automatically collects metadata from the Logstore. |
Feature Module | Yes | The DMS feature modules whose logs you want to export. These modules correspond to the modules on the Operation Logs page. Options include Unlimited, Permission, Data Owner, Data Query, Query Result Export, and Cross-database Query Result Export. The default value is Unlimited. |
Scheduling Method | Yes | The scheduling method of the task. One-time: The export task runs only once after it is created. Periodic: Logs are exported to the Logstore on a Daily, Weekly, or Monthly basis. The first run of a periodic task exports all operation logs generated in DMS from the start time of log generation to the start time of the first scheduling; subsequent runs export only incremental logs. For more information, see Recurring schedule. |
Log Time Range | No | The time range of the logs to export. If you leave this parameter empty, logs from the last three years are exported by default. Note: This parameter appears only when you set Scheduling Method to One-time. |
Log Start Time | No | The start time for DMS log recording. If you leave this parameter empty, the default value is three years before the task creation time. Periodic tasks do not have an end time. Note: This parameter appears only when you set Scheduling Method to Periodic. |
- Click Confirm. An export task is created. The system also creates indexed fields, such as dbId, dbName, and dbUser, in your Logstore for query and analysis.
For a one-time task, logs are exported only once. When the task status changes to Running Successfully, the logs have been exported. Note: Because the Logstore index takes time to become effective, a one-time task starts about 90 seconds after it is created.
For a periodic task, logs are exported multiple times. The task status is displayed as Pending both before and after each export. You can view the task logs to check whether a specific run was successful.
You can also perform the following operations in the Actions column for a task:
- Query: Click Query to go to the SQL Console page. Then, click Query to view the exported logs in the execution result area at the bottom of the page.
- Task Logs: Click Task Logs to view information such as the task start and end times, the number of logs delivered, and the task status.
- Pause: Click Pause. In the dialog box that appears, click Confirm to pause the periodic task.
- Restart: Click Restart. In the dialog box that appears, click Confirm to restart a paused periodic task.
Recurring schedule
Parameter | Description |
Scheduling Cycle | The cycle for the scheduled task: Daily, Weekly, or Monthly. |
Specified Time | The day on which the task runs. For a Weekly cycle, select one or more days of the week; for a Monthly cycle, select one or more days of the month. |
Specific Time | The specific time to run the task. For example, if you set this parameter to 02:55, the system runs the task at 02:55 on the specified days. |
CRON Expression | You do not need to configure this parameter manually. The system automatically displays the expression based on the cycle and time that you configure. |
Code implementation
You can also export operation logs by using Java code.
The account must have the GetOpLog permission for DMS, and the PutLogs and CreateIndex permissions for SLS. For improved security, avoid hardcoding an AccessKey in your project code. For more information about how to configure credentials, see Manage access credentials.
Add the following Maven dependencies to your project:
<dependency>
<groupId>com.aliyun</groupId>
<artifactId>dms_enterprise20181101</artifactId>
<version>2.0.0</version>
</dependency>
<dependency>
<groupId>com.aliyun.openservices</groupId>
<artifactId>aliyun-log</artifactId>
<version>0.6.100</version>
</dependency>

The following sample code obtains operation logs from DMS and writes them to an SLS Logstore:

package org.example;
import com.aliyun.dms_enterprise20181101.models.GetOpLogResponse;
import com.aliyun.dms_enterprise20181101.models.GetOpLogResponseBody;
import com.aliyun.openservices.log.common.Index;
import com.aliyun.openservices.log.common.IndexKey;
import com.aliyun.openservices.log.common.IndexKeys;
import com.aliyun.openservices.log.common.IndexLine;
import com.aliyun.openservices.log.common.LogItem;
import com.aliyun.openservices.log.response.GetIndexResponse;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import java.util.Optional;
public class ExportDmsOperLogExample {
private static com.aliyun.dms_enterprise20181101.Client dmsClient = null;
private static com.aliyun.openservices.log.Client slsClient = null;
public static com.aliyun.credentials.Client getCredentialClient() {
com.aliyun.credentials.models.Config credentialConfig = new com.aliyun.credentials.models.Config();
credentialConfig.setType("access_key");
// Required. This example shows how to obtain the AccessKey ID from an environment variable.
credentialConfig.setAccessKeyId(System.getenv("ALIBABA_CLOUD_ACCESS_KEY_ID"));
// Required. This example shows how to obtain the AccessKey secret from an environment variable.
credentialConfig.setAccessKeySecret(System.getenv("ALIBABA_CLOUD_ACCESS_KEY_SECRET"));
return new com.aliyun.credentials.Client(credentialConfig);
}
// Create a DMS OpenAPI client to call GetOpLog.
public static synchronized com.aliyun.dms_enterprise20181101.Client createDmsClient() throws Exception {
if (dmsClient != null) {
return dmsClient;
}
com.aliyun.teaopenapi.models.Config config = new com.aliyun.teaopenapi.models.Config()
.setCredential(getCredentialClient());
// For the endpoint, see https://api.aliyun.com/product/dms-enterprise
config.endpoint = "dms-enterprise.cn-hangzhou.aliyuncs.com";
dmsClient = new com.aliyun.dms_enterprise20181101.Client(config);
return dmsClient;
}
// Create an SLS OpenAPI client to call PutLogs.
public static synchronized com.aliyun.openservices.log.Client createSlsClient() throws Exception {
if (slsClient != null) {
return slsClient;
}
com.aliyun.credentials.Client credentialClient = getCredentialClient();
// For the endpoint, see https://api.aliyun.com/product/Sls
String endpoint = "cn-hangzhou.log.aliyuncs.com";
slsClient = new com.aliyun.openservices.log.Client(endpoint, credentialClient.getAccessKeyId(), credentialClient.getAccessKeySecret());
return slsClient;
}
// Call the DMS OpenAPI GetOpLog operation to obtain operation logs.
public static List<GetOpLogResponseBody.GetOpLogResponseBodyOpLogDetailsOpLogDetail> getLogs(com.aliyun.dms_enterprise20181101.Client client, String startTime, String endTime, Integer pageNumber, Integer pageSize) {
com.aliyun.dms_enterprise20181101.models.GetOpLogRequest getOpLogRequest = new com.aliyun.dms_enterprise20181101.models.GetOpLogRequest()
.setStartTime(startTime)
.setEndTime(endTime)
.setPageSize(pageSize)
.setPageNumber(pageNumber);
com.aliyun.teautil.models.RuntimeOptions runtime = new com.aliyun.teautil.models.RuntimeOptions();
try {
GetOpLogResponse response = client.getOpLogWithOptions(getOpLogRequest, runtime);
return Optional.ofNullable(response.getBody())
.map(GetOpLogResponseBody::getOpLogDetails)
.map(GetOpLogResponseBody.GetOpLogResponseBodyOpLogDetails::getOpLogDetail)
.orElse(new ArrayList<>());
} catch (Exception e) {
// Wrap and rethrow with the cause preserved so that the caller sees the full stack trace.
throw new RuntimeException("Failed to call GetOpLog: " + e.getMessage(), e);
}
}
// Call the PutLogs operation to import logs to an SLS Logstore.
public static void putLogs(com.aliyun.openservices.log.Client client, String project, String logStore, List<GetOpLogResponseBody.GetOpLogResponseBodyOpLogDetailsOpLogDetail> logDetailList) {
List<LogItem> logItemList = new ArrayList<>();
for (GetOpLogResponseBody.GetOpLogResponseBodyOpLogDetailsOpLogDetail logDetail : logDetailList) {
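// Use the current time as the log time. The original operation time is still preserved in the opTime field.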
LogItem logItem = new LogItem((int) (new Date().getTime() / 1000));
logItem.PushBack("module", logDetail.getModule());
logItem.PushBack("database", logDetail.getDatabase());
logItem.PushBack("userId", logDetail.getUserId());
logItem.PushBack("opUserId", String.valueOf(logDetail.getOpUserId()));
logItem.PushBack("userNick", logDetail.getUserNick());
logItem.PushBack("opTime", logDetail.getOpTime());
logItem.PushBack("opContent", logDetail.getOpContent());
logItem.PushBack("orderId", String.valueOf(logDetail.getOrderId()));
logItemList.add(logItem);
}
try {
client.PutLogs(project, logStore, "", logItemList, "");
} catch (Exception e) {
throw new RuntimeException("Failed to call PutLogs: " + e.getMessage(), e);
}
}
// Add field indexes to the Logstore's existing index configuration. You only need to initialize a Logstore once.
public static void createIndex(com.aliyun.openservices.log.Client client, String project, String logStore) {
try {
GetIndexResponse getIndexResponse = client.GetIndex(project, logStore);
Index index = getIndexResponse.GetIndex();
IndexLine indexLine = index.GetLine();
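// Reuse the token list (tokenizer) of the existing full-text index for the new field indexes.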
List<String> indexToken = null;
if (indexLine != null) {
indexToken = indexLine.GetToken();
}
IndexKeys keys = new IndexKeys();
List<String> logStoreKeys = List.of("module", "database", "userId", "opUserId",
"userNick", "opTime", "opContent", "orderId");
for (String logStoreKey : logStoreKeys) {
IndexKey key = new IndexKey();
key.SetType("text");
key.SetChn(true);
key.SetDocValue(false);
key.SetToken(indexToken);
keys.AddKey(logStoreKey, key);
}
index.SetKeys(keys);
client.UpdateIndex(project, logStore, index);
} catch (Exception e) {
throw new RuntimeException("Failed to update the Logstore index: " + e.getMessage(), e);
}
}
public static void main(String[] args) throws Exception {
// The start time of DMS operation logs (accurate to the second).
String startStr = "2025-01-10 14:30:00";
// The end time of DMS operation logs (accurate to the second).
String endStr = "2025-01-13 09:15:00";
// The name of the SLS project. Replace it with your own project name.
String project = "project";
// The name of the SLS Logstore. Replace it with your own Logstore name.
String logStore = "logStore";
// Create an index for the SLS Logstore. You only need to create an index for a Logstore once.
createIndex(createSlsClient(), project, logStore);
DateTimeFormatter formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
LocalDateTime start = LocalDateTime.parse(startStr, formatter);
LocalDateTime end = LocalDateTime.parse(endStr, formatter);
if (start.isAfter(end)) {
throw new IllegalArgumentException("The start time cannot be later than the end time.");
}
LocalDateTime current = start;
// The volume of DMS logs is large. To prevent OpenAPI request timeouts, request logs for one day at a time.
while (!current.isAfter(end)) {
// Calculate the end time of the current day: Use the earlier of 23:59:59 on the current day and the overall end time.
LocalDateTime dayEnd = current.toLocalDate().atTime(23, 59, 59);
if (dayEnd.isAfter(end)) {
dayEnd = end;
}
String segmentStartStr = current.format(formatter);
String segmentEndStr = dayEnd.format(formatter);
// Business logic: First, obtain DMS logs, and then import them to SLS.
process(segmentStartStr, segmentEndStr, project, logStore);
// Move to 00:00:00 of the next day.
current = dayEnd.toLocalDate().plusDays(1).atStartOfDay();
}
}
public static void process(String startTime, String endTime, String project, String logStore) throws Exception {
int pageNumber = 1;
int pageSize = 100;
com.aliyun.dms_enterprise20181101.Client dmsClient = createDmsClient();
com.aliyun.openservices.log.Client slsClient = createSlsClient();
List<GetOpLogResponseBody.GetOpLogResponseBodyOpLogDetailsOpLogDetail> logs = getLogs(dmsClient, startTime, endTime, pageNumber, pageSize);
// Page through the results. Write every non-empty page to SLS, including the final partial page,
// which the original loop condition (logs.size() >= 100) would have dropped.
while (!logs.isEmpty()) {
putLogs(slsClient, project, logStore, logs);
if (logs.size() < pageSize) {
break;
}
pageNumber++;
logs = getLogs(dmsClient, startTime, endTime, pageNumber, pageSize);
}
}
}
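The sample above exports a fixed time range. To approximate the console's Periodic scheduling in code, you can persist the end time of each run and use it as the start time of the next run, so that each run exports only the new increment. The following is a minimal sketch under that assumption; the checkpoint file name is hypothetical, and the class is expected to be triggered by an external scheduler.

package org.example;

import java.nio.file.Files;
import java.nio.file.Path;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// A minimal sketch, not part of the DMS or SLS APIs: the checkpoint file name and
// the external schedule are assumptions for illustration.
public class IncrementalExportSketch {
private static final DateTimeFormatter FORMATTER = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
private static final Path CHECKPOINT = Path.of("dms-export-checkpoint.txt");
public static void main(String[] args) throws Exception {
// Start from the last checkpoint, or default to three years ago, matching the console default.
LocalDateTime start = Files.exists(CHECKPOINT)
? LocalDateTime.parse(Files.readString(CHECKPOINT).trim(), FORMATTER)
: LocalDateTime.now().minusYears(3);
LocalDateTime end = LocalDateTime.now();
// Reuse the paging logic of the sample above to export the increment.
// For long backfills, segment the range by day as in the sample's main() method.
ExportDmsOperLogExample.process(start.format(FORMATTER), end.format(FORMATTER), "project", "logStore");
// Persist the checkpoint so that the next run starts where this one ended.
Files.writeString(CHECKPOINT, end.format(FORMATTER));
}
}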
References
After you export DMS operation logs to SLS, you can query and analyze them. For more information, see Query and analyze DMS operation logs in Simple Log Service.
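You can also retrieve the exported logs programmatically by calling the SLS GetLogs operation. The following is a minimal sketch that reuses the createSlsClient() helper and the project and logStore placeholders from the sample above; the module value PERMISSION is illustrative.

package org.example;

import com.aliyun.openservices.log.common.QueriedLog;
import com.aliyun.openservices.log.response.GetLogsResponse;

// A minimal sketch that queries logs written by the sample above.
public class QueryExportedLogsExample {
public static void main(String[] args) throws Exception {
com.aliyun.openservices.log.Client slsClient = ExportDmsOperLogExample.createSlsClient();
// Query the last hour of exported logs.
int to = (int) (System.currentTimeMillis() / 1000);
int from = to - 3600;
// Filter on the "module" field that the sample writes; adjust the value to your needs.
GetLogsResponse response = slsClient.GetLogs("project", "logStore", from, to, "", "module: PERMISSION");
for (QueriedLog queriedLog : response.GetLogs()) {
System.out.println(queriedLog.GetLogItem().ToJsonString());
}
}
}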