AnalyticDB for MySQL:SQL editor

Last Updated:Jul 07, 2023

AnalyticDB for MySQL Data Lakehouse Edition (V3.0) provides a built-in SQL editor. You can perform SQL development in the AnalyticDB for MySQL console. This topic describes the features of the SQL editor.

Usage notes

  • If you log on to the AnalyticDB for MySQL console as a Resource Access Management (RAM) user to perform SQL development, you must associate a standard database account of AnalyticDB for MySQL with the RAM user. For more information, see Associate or disassociate a database account with or from a RAM user.

  • By default, privileged database accounts are associated with Alibaba Cloud accounts. You can directly use a privileged database account to perform SQL development.

Go to the SQL editor

  1. Log on to the AnalyticDB for MySQL console. In the upper-left corner of the console, select a region. In the left-side navigation pane, click Clusters. On the Data Lakehouse Edition (V3.0) tab, find the cluster that you want to manage and click the cluster ID.

  2. In the left-side navigation pane, choose Job Development > SQL Development to go to the SQL editor.

Load a built-in dataset

If you did not choose to load a built-in dataset when you created an AnalyticDB for MySQL Data Lakehouse Edition (V3.0) cluster, you can click Load Built-in Dataset on the SQL Development page to load one for testing purposes. In the message that appears, click OK.

For more information about the built-in dataset, see Manage a built-in dataset.

View the schemas of databases and tables

On the Databases and Tables tab of the SQL Development page, you can view the INFORMATION_SCHEMA system database and its tables, automatically created databases and tables (such as those ingested into the AnalyticDB for MySQL cluster in real time), and manually created databases and tables. To switch between databases, select an option from the drop-down list.
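
For example, you can run a query against INFORMATION_SCHEMA in the SQL editor to inspect table metadata. The following is a minimal sketch; adb_demo is a hypothetical database name used for illustration:

```sql
-- List the tables in a database (adb_demo is a placeholder name).
SELECT TABLE_SCHEMA, TABLE_NAME, TABLE_TYPE
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_SCHEMA = 'adb_demo';
```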

Select an engine and a resource group

You can select different engines and resource groups to implement different SQL development methods. The following table describes the SQL development methods.

| Engine | Resource group | Supported SQL development method | SQL execution mode |
| ------ | -------------- | -------------------------------- | ------------------ |
| Spark | Job resource group | Spark SQL | Batch mode and interactive mode. For more information, see Spark SQL execution modes. |
| XIHE | Job resource group | XIHE BSP SQL | Batch mode. |
| XIHE | Interactive resource group | XIHE MPP SQL | Interactive mode. |

Important

XIHE BSP SQL and XIHE MPP SQL cannot be used to write Hudi tables.
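
If you need to write data to a Hudi table, select the Spark engine and a job resource group instead. A minimal sketch, assuming a Hudi table named hudi_tbl in the hypothetical database adb_demo:

```sql
-- Run with the Spark engine and a job resource group selected.
-- adb_demo and hudi_tbl are placeholder names for illustration.
INSERT INTO adb_demo.hudi_tbl VALUES (1, 'example');
```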

View and export query results

After an SQL statement is executed, the query result or failure message is displayed on the Execution Results tab. The execution information about the SQL statement, such as the execution status, resource group, database account, and execution duration, is displayed on the Execution Records tab.

You can export the query results of XIHE BSP SQL to your on-premises device.

To configure the number of rows for exported results, click ... next to Execute(F8)(LIMIT 1000). If the number of rows specified in the LIMIT clause of the SQL statement differs from the number of rows selected in the console, the smaller value takes effect.
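
For example, if the console setting is kept at LIMIT 1000, a statement that specifies a smaller limit returns only that smaller number of rows (orders is a hypothetical table name):

```sql
-- With the console set to LIMIT 1000, this query returns at most
-- 500 rows because the smaller of the two values prevails.
SELECT * FROM orders LIMIT 500;
```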

To export the query results, click Download in the upper-right corner of the Execution Results tab.

Save SQL scripts

You can save SQL statements on the SQLConsole tab as scripts and then view them on the Scripts tab.

Diagnose query results

SQL diagnostics is supported for XIHE BSP SQL and XIHE MPP SQL. On the Execution Records tab, click Diagnose in the Actions column corresponding to an SQL statement to diagnose the execution plan of the statement. For more information about SQL diagnostics, see Overview of SQL diagnostics.

View Spark logs

After a Spark SQL statement is executed, you can click Log in the Actions column corresponding to an SQL statement on the Execution Records tab to view Spark logs.

Access the Spark web UI

After a Spark SQL statement is executed, you can click UI in the Actions column corresponding to an SQL statement on the Execution Records tab to access the Spark web UI and view the execution details of Spark jobs.