
Spark UI

Last Updated: Nov 18, 2020

You can view information about a Spark job on the Apache Spark web UI while the job is running or after it succeeds.

Procedure

  1. Log on to the DLA console.

  2. In the top navigation bar, select the region where DLA is deployed.

  3. In the left-side navigation pane, choose Serverless Spark > Submit job.

  4. In the job list, find your Spark job and click SparkUI in the Operation column.


  5. Use a browser to access the Apache Spark web UI.


Precautions

By default, the historical information about a Spark job is not saved, so you can view a job on the Apache Spark web UI only while it is running. To access the Apache Spark web UI after a job succeeds, add a configuration item when you submit the job. This configuration item saves the historical information about the job to your OSS bucket. You can then view historical jobs on the Apache Spark web UI in the same way as running jobs.

Configuration item:

"spark.dla.job.log.oss.uri": "oss://{bucket_name}/{dir_name}"
Note
  • bucket_name: the bucket to which the Spark job belongs. The bucket must be in the same region as the Spark job.

  • dir_name: the name of the directory. Multi-level directories are supported.

Example:

{
    "name": "SparkPi",
    "file": "local:///tmp/spark-examples.jar",
    "className": "org.apache.spark.examples.SparkPi",
    "conf": {
        "spark.dla.job.log.oss.uri": "oss://your-bucket/your-logs-dir"
    }
}