MaxCompute Query Acceleration (MCQA) is a built-in feature of MaxCompute. MCQA uses the native MaxCompute SQL language and supports MaxCompute built-in functions and permission systems. This topic describes how to use the MCQA feature.
Background information
You can use the MCQA feature in one of the following ways:
- Use the MaxCompute client. For more information, see Enable MCQA on the MaxCompute client.
- Use the ad hoc query or data analytics feature of DataWorks. By default, the MCQA feature is enabled for ad hoc query or data analytics of DataWorks. For more information, see Enable MCQA for ad hoc query or data analytics of DataWorks.
- Use the MaxCompute JDBC driver. For more information, see JDBC.
- Use an Alibaba Cloud MaxCompute SDK. This method requires you to add a Maven POM dependency. For more information, see Enable MCQA based on Alibaba Cloud MaxCompute SDK for Java.
Enable MCQA on the MaxCompute client
To enable the MCQA feature on the MaxCompute client, update the client configuration as outlined in the sketch below.
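The following is a minimal sketch, not the official procedure: it assumes that you use a client (odpscmd) version that supports MCQA and that the enable_interactive_mode and interactive_auto_rerun settings in conf/odps_config.ini control MCQA. These setting names are assumptions; verify them against the documentation for your client version.

# conf/odps_config.ini (assumed settings; verify against your client version)
project_name=your_project_name
access_id=your_access_id
access_key=your_access_key
end_point=http://service.<regionid>.maxcompute.aliyun.com/api
# Assumed flag that switches queries to MCQA (interactive) mode.
enable_interactive_mode=true
# Assumed flag that reruns a query as an offline job if it cannot run in MCQA mode.
interactive_auto_rerun=true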
Enable MCQA for ad hoc query or data analytics of DataWorks
By default, the MCQA feature is enabled for the Ad-Hoc Query and Manually Triggered Workflow modules of DataWorks. If you want to disable the MCQA feature, submit a ticket.


JDBC
You can enable MCQA in the following JDBC-based scenarios:
- Use the MaxCompute JDBC driver to connect to MaxCompute. For more information, see Usage notes of JDBC. To enable the MCQA feature, you must modify the relevant configuration. For more information, see Enable MCQA on MaxCompute JDBC.
- Use the MaxCompute JDBC driver to connect to Tableau. Then, you can use Tableau to analyze MaxCompute data in a visualized manner. For more information, see Configure MaxCompute JDBC on Tableau. To enable the MCQA feature in this scenario, you must modify the relevant configuration. For more information, see Enable MCQA on Tableau based on MaxCompute JDBC.
- Use the MaxCompute JDBC driver to connect to SQL Workbench/J. Then, you can use SQL Workbench/J to execute SQL statements on MaxCompute data. For more information, see Configure MaxCompute JDBC on SQL Workbench/J. To enable the MCQA feature in this scenario, you must modify the relevant configuration. For more information, see Enable MCQA on SQL Workbench/J based on MaxCompute JDBC.
Enable MCQA on MaxCompute JDBC
If you use the MaxCompute JDBC driver to connect to MaxCompute, add the interactiveMode=true property to the JDBC URL to enable the MCQA feature. For more information, see Usage notes of JDBC.
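The following is a minimal sketch rather than the exact procedure from the JDBC documentation. It assumes the MaxCompute JDBC driver class com.aliyun.odps.jdbc.OdpsDriver and reuses the placeholder endpoint, project, and AccessKey pair from the URL format shown later in this topic; the key point is appending interactiveMode=true (and optionally enableOdpsLogger=true) to the JDBC URL.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class McqaJdbcDemo {
    public static void main(String[] args) throws Exception {
        // Load the MaxCompute JDBC driver (assumed driver class name).
        Class.forName("com.aliyun.odps.jdbc.OdpsDriver");
        // Replace the placeholders with your endpoint, project, and AccessKey pair.
        // interactiveMode=true enables MCQA; enableOdpsLogger=true prints logs.
        String url = "jdbc:odps:<MaxCompute_endpoint>?project=<MaxCompute_project_name>"
                + "&accessId=<AccessKey ID>&accessKey=<AccessKey Secret>"
                + "&charset=UTF-8&interactiveMode=true&enableOdpsLogger=true";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("select count(1) from test_table;")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}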
Enable MCQA on Tableau based on MaxCompute JDBC
Add the interactiveMode=true property to enable the MCQA feature for a specified server. We recommend that you also add the enableOdpsLogger=true property to display logs. For more information, see Configure MaxCompute JDBC on Tableau. Example:
http://service.cn-beijing.maxcompute.aliyun.com/api?project=****_beijing&interactiveMode=true&enableOdpsLogger=true
To perform operations on specific tables in a MaxCompute project, add the table_list=table_name1,table_name2 property in the parameters of the server. Separate table names with commas (,). An excessive number of tables may cause Tableau to open slowly. We recommend that you use this method to load only the required tables. The following example shows how to load specified tables in Tableau:
http://service.cn-beijing.maxcompute.aliyun.com/api?project=****_beijing&interactiveMode=true&enableOdpsLogger=true&table_list=orders,customers
For tables that include a large number of partitions, we recommend that you do not use data from all partitions as the data source. You can filter the required partitions or run custom SQL queries to obtain the required data.
Enable MCQA on SQL Workbench/J based on MaxCompute JDBC
After you configure the MaxCompute JDBC driver, modify the JDBC URL that you specified on the profile configuration page. Then, you can use the MCQA feature on SQL Workbench/J. For more information about how to configure the profile, see Configure MaxCompute JDBC on SQL Workbench/J.
jdbc:odps:<MaxCompute_endpoint>?project=<MaxCompute_project_name>&accessId=<AccessKey ID>&accessKey=<AccessKey Secret>&charset=UTF-8&interactiveMode=true
Parameter description:
- MaxCompute_endpoint: the endpoint of the region where your MaxCompute service is deployed. For more information, see Configure endpoints.
- MaxCompute_project_name: the name of your MaxCompute project.
- AccessKey ID: the AccessKey ID that has the permissions to access the specified project.
- AccessKey Secret: the AccessKey secret that corresponds to the AccessKey ID.
- charset=UTF-8: the character set encoding format.
- interactiveMode: specifies whether to enable the MCQA feature. To enable the MCQA feature, set this parameter to true.
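For reference, a fully assembled URL based on the format above might look like the following. The endpoint and project name are illustrative and reuse the placeholders from the Tableau example earlier in this topic:
jdbc:odps:http://service.cn-beijing.maxcompute.aliyun.com/api?project=****_beijing&accessId=<AccessKey ID>&accessKey=<AccessKey Secret>&charset=UTF-8&interactiveMode=true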
Enable MCQA based on Alibaba Cloud MaxCompute SDK for Java
Add the following dependency to the pom.xml file of your Maven project:
<dependency>
    <groupId>com.aliyun.odps</groupId>
    <artifactId>odps-sdk-core</artifactId>
    <version>0.35.5-public</version>
</dependency>
The following sample code shows how to use the SQLExecutor class to run queries in MCQA (interactive) mode or in offline mode:
import com.aliyun.odps.Odps;
import com.aliyun.odps.OdpsException;
import com.aliyun.odps.OdpsType;
import com.aliyun.odps.account.Account;
import com.aliyun.odps.account.AliyunAccount;
import com.aliyun.odps.data.Record;
import com.aliyun.odps.data.ResultSet;
import com.aliyun.odps.sqa.*;
import java.io.IOException;
import java.util.*;
public class SQLExecutorExample {
    public static void SimpleExample() {
        // Specify the account and project information.
        Account account = new AliyunAccount("your_access_id", "your_access_key");
        Odps odps = new Odps(account);
        odps.setDefaultProject("your_project_name");
        odps.setEndpoint("http://service.<regionid>.maxcompute.aliyun.com/api");
        // Prepare to create SQL executors.
        SQLExecutorBuilder builder = SQLExecutorBuilder.builder();
        SQLExecutor sqlExecutor = null;
        try {
            // Run in offline mode or in interactive (MCQA) mode.
            if (false) {
                // Create an executor that runs offline SQL queries by default.
                sqlExecutor = builder.odps(odps).executeMode(ExecuteMode.OFFLINE).build();
            } else {
                // Create an executor that runs MCQA SQL queries by default.
                sqlExecutor = builder.odps(odps).executeMode(ExecuteMode.INTERACTIVE).build();
            }
            // Pass special query settings if required.
            Map<String, String> queryHint = new HashMap<>();
            queryHint.put("odps.sql.mapper.split.size", "128");
            // Submit a query job. You can pass hints.
            sqlExecutor.run("select count(1) from test_table;", queryHint);
            // List some supported API operations that are used to query information.
            // UUID of the executor.
            System.out.println("ExecutorId:" + sqlExecutor.getId());
            // Query the Logview of the current query job.
            System.out.println("Logview:" + sqlExecutor.getLogView());
            // Query the instance on which the current query job runs. In interactive mode, multiple query jobs may run on the same instance.
            System.out.println("InstanceId:" + sqlExecutor.getInstance().getId());
            // Query the progress of the current query job. You can check the progress bar in the console.
            System.out.println("QueryStageProgress:" + sqlExecutor.getProgress());
            // Query the change logs about the execution status of the current query job, such as rollback messages.
            System.out.println("QueryExecutionLog:" + sqlExecutor.getExecutionLog());
            // Two API operations are provided to query results.
            if (false) {
                // Query the results of the query job. This is a synchronous operation and may occupy this thread until the query succeeds or fails.
                List<Record> records = sqlExecutor.getResult();
                printRecords(records);
            } else {
                // Query the ResultSet iterator of the query results. This is a synchronous operation and may occupy this thread until the query succeeds or fails.
                ResultSet resultSet = sqlExecutor.getResultSet();
                while (resultSet.hasNext()) {
                    printRecord(resultSet.next());
                }
            }
            // Run another query.
            sqlExecutor.run("select * from test_table;", new HashMap<>());
            if (false) {
                // Query all query results. This is a synchronous operation and may occupy this thread until the query succeeds or fails.
                List<Record> records = sqlExecutor.getResult();
                printRecords(records);
            } else {
                // Query the ResultSet iterator of the query results. This is a synchronous operation and may occupy this thread until the query succeeds or fails.
                ResultSet resultSet = sqlExecutor.getResultSet();
                while (resultSet.hasNext()) {
                    printRecord(resultSet.next());
                }
            }
        } catch (OdpsException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (sqlExecutor != null) {
                // Close the executor and release related resources.
                sqlExecutor.close();
            }
        }
    }
    // An SQLExecutor can be reused in pool mode.
    public static void ExampleWithPool() {
        // Specify the account and project information.
        Account account = new AliyunAccount("your_access_id", "your_access_key");
        Odps odps = new Odps(account);
        odps.setDefaultProject("your_project_name");
        odps.setEndpoint("http://service.<regionid>.maxcompute.aliyun.com/api");
        // Run queries by using a connection pool.
        SQLExecutorPool sqlExecutorPool = null;
        SQLExecutor sqlExecutor = null;
        try {
            // Prepare a connection pool. Specify the connection pool size and the default execution mode.
            SQLExecutorPoolBuilder builder = SQLExecutorPoolBuilder.builder();
            builder.odps(odps)
                   .initPoolSize(1) // The initial number of executors in the pool.
                   .maxPoolSize(5)  // The maximum number of executors in the pool.
                   .executeMode(ExecuteMode.INTERACTIVE); // Run in interactive (MCQA) mode.
            sqlExecutorPool = builder.build();
            // Obtain an executor from the connection pool. If no executor is available, a new executor is added, provided that the total number of executors does not exceed the upper limit.
            sqlExecutor = sqlExecutorPool.getExecutor();
            // Use the executor in the same way it is used in the preceding example.
            sqlExecutor.run("select count(1) from test_table;", new HashMap<>());
            System.out.println("ExecutorId:" + sqlExecutor.getId());
            System.out.println("Logview:" + sqlExecutor.getLogView());
            List<Record> records = sqlExecutor.getResult();
            printRecords(records);
        } catch (OdpsException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (sqlExecutor != null) {
                sqlExecutor.close();
            }
        }
        if (sqlExecutorPool != null) {
            sqlExecutorPool.close();
        }
    }
    private static void printRecord(Record record) {
        for (int k = 0; k < record.getColumnCount(); k++) {
            if (k != 0) {
                System.out.print("\t");
            }
            if (record.getColumns()[k].getType().equals(OdpsType.STRING)) {
                System.out.print(record.getString(k));
            } else if (record.getColumns()[k].getType().equals(OdpsType.BIGINT)) {
                System.out.print(record.getBigint(k));
            } else {
                System.out.print(record.get(k));
            }
        }
    }
    private static void printRecords(List<Record> records) {
        for (Record record : records) {
            printRecord(record);
            System.out.println();
        }
    }
    public static void main(String[] args) {
        SimpleExample();
        ExampleWithPool();
    }
}