AnalyticDB for MySQL: Spark editor

Last Updated: Nov 09, 2023

This topic describes how to create and run Spark applications in the AnalyticDB for MySQL console.

Overview

  • You can use the Spark editor to create and run Spark batch or streaming applications.

  • You can view the driver log and submission details of the current Spark application.

  • You can view the execution logs of SQL statements.

Prerequisites

  • An AnalyticDB for MySQL Data Lakehouse Edition (V3.0) cluster is created.

  • A job resource group is created for the cluster.

Create and run a Spark application

  1. Log on to the AnalyticDB for MySQL console.
  2. In the upper-left corner of the page, select a region.
  3. In the left-side navigation pane, click Clusters.
  4. On the Data Lakehouse Edition (V3.0) tab, find the cluster that you want to manage and click the Cluster ID.
  5. In the left-side navigation pane, choose Job Development > Spark JAR Development.

  6. On the Spark JAR Development page, click the + icon to the right of Applications.

  7. In the Create Application Template dialog box, configure the parameters that are described in the following list.

    • Name: the name of the application or directory. File names are case-insensitive.

    • Type: the format of the template.

      • Application: the template is a file.

      • Directory: the template is a folder.

    • Parent Level: the parent directory of the file or folder.

    • Job Type: the type of the Spark job.

      • Batch: a batch application.

      • Streaming: a streaming application.

  8. Click OK.

  9. After you create a Spark template, configure a Spark application in the Spark editor. For more information, see Overview of Spark application development. A sample batch template is shown after this procedure.

  10. After you configure the Spark application, perform the following operations:

    • Click Save to save the Spark application so that you can reuse it later.

    • Click Run Now to run the Spark application. The status of the Spark application is displayed on the Applications tab in real time.

      Note

      Before you run a Spark application, you must select a job resource group and an application type.
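
For reference, the following is a minimal sketch of a batch application template of the kind that you can edit in the Spark editor. The JAR path, class name, and argument are placeholders, and the resource settings are illustrative; see Overview of Spark application development for the full set of supported fields.

    {
      "comments": ["-- Example only: replace the JAR path, class name, and arguments with your own values."],
      "name": "SparkPi",
      "file": "local:///tmp/spark-examples.jar",
      "className": "org.apache.spark.examples.SparkPi",
      "args": ["1000"],
      "conf": {
        "spark.driver.resourceSpec": "medium",
        "spark.executor.resourceSpec": "medium",
        "spark.executor.instances": 2
      }
    }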

View information about a Spark application

  1. On the Applications tab, search for an application by application ID and perform the following operations to view information about the Spark application:

    • Click Log in the Actions column to view the driver log of the current Spark application or the execution log of SQL statements.

    • Click UI in the Actions column to go to the Spark UI of the application. The URL of the Spark UI is valid only for a limited period of time. After the URL expires, click UI again to obtain a new one.

    • Click Details in the Actions column to view submission details of the current application, such as the log path, web UI URL, cluster ID, and resource group name.

    • Choose More > Stop in the Actions column to stop the current application.

    • Choose More > History in the Actions column to view the history of retry attempts on the current application.

  2. On the Execution History tab, view the history of retry attempts on all applications.

    Note

    By default, no retry is performed after an application fails. To perform retry attempts, configure the spark.adb.maxAttempts and spark.adb.attemptFailuresValidityInterval parameters. For more information, see Configuration parameters of Spark applications.
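
    For example, the following conf snippet is a minimal sketch with illustrative values; it allows up to three attempts within a one-hour validity window. See Configuration parameters of Spark applications for the exact semantics and accepted value formats.

      {
        "conf": {
          "spark.adb.maxAttempts": "3",
          "spark.adb.attemptFailuresValidityInterval": "1h"
        }
      }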