
Data Lake Analytics:Apache Spark web UI

Last Updated:May 19, 2022

This topic describes how to use the Apache Spark web UI to view information about a Spark job that is running or has completed in Data Lake Analytics (DLA).


  1. Log on to the DLA console.

  2. In the top navigation bar, select the region where DLA resides.

  3. In the left-side navigation pane, choose Serverless Spark > Submit job.

  4. On the Parameter Configuration page, click the Job list tab, find your Spark job, and then select SparkUI from the Operation drop-down list.

  5. Use a browser to access the Apache Spark web UI. The UI is the same as the one provided by the open source Apache Spark community.


Usage notes

1. Log directory

You can access the Apache Spark web UI after your job is completed. The historical logs of the job are automatically uploaded to the OSS directory oss://aliyun-oa-query-results-<user-parent-id>-oss-<region-id>/Spark_Tmp/. To use a custom directory, set the spark.dla.job.log.oss.uri parameter to a value in the oss://bucket_name/dir_name format.

  • bucket_name: the name of an OSS bucket that resides in the same region as the serverless Spark engine of DLA.

  • dir_name: the name of the directory. Multi-level directories are supported.
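For illustration, the default and custom log locations can be assembled as plain strings. The helper names below are hypothetical; only the URI formats come from this topic:

```python
# Hypothetical helpers illustrating the log directory formats described above.

def default_log_dir(user_parent_id: str, region_id: str) -> str:
    # Default OSS directory that DLA uploads historical job logs to.
    return f"oss://aliyun-oa-query-results-{user_parent_id}-oss-{region_id}/Spark_Tmp/"

def custom_log_uri(bucket_name: str, dir_name: str) -> str:
    # Value for the spark.dla.job.log.oss.uri parameter.
    # dir_name may contain multiple levels, such as "your-logs/spark".
    return f"oss://{bucket_name}/{dir_name}"

print(default_log_dir("12345", "cn-hangzhou"))
# oss://aliyun-oa-query-results-12345-oss-cn-hangzhou/Spark_Tmp/
print(custom_log_uri("your-bucket", "your-logs/spark"))
# oss://your-bucket/your-logs/spark
```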

Sample job configurations:

    "name": "SparkPi",
    "file": "local:///tmp/spark-examples.jar",
    "className": "org.apache.spark.examples.SparkPi",
    "conf": {
        "spark.dla.job.log.oss.uri": "oss://your-bucket/your-logs-dir"

2. Validity period of Apache Spark web UI

The Apache Spark web UI occupies network resources of the service platform. You can use it to view the logs only of jobs that completed within the last three days. After this validity period expires, the Apache Spark web UI can no longer be accessed. However, the log data in OSS is not deleted, and you can query or delete it on the OSS client based on your business requirements.
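The three-day validity window can be expressed as a small check. The function name is hypothetical; only the three-day rule comes from this topic:

```python
from datetime import datetime, timedelta

# The Apache Spark web UI stays accessible for three days after job completion.
UI_VALIDITY = timedelta(days=3)

def web_ui_accessible(completed_at: datetime, now: datetime) -> bool:
    # True while the job completed within the validity window.
    return now - completed_at <= UI_VALIDITY

finished = datetime(2022, 5, 16, 12, 0)
print(web_ui_accessible(finished, datetime(2022, 5, 18, 12, 0)))  # True
print(web_ui_accessible(finished, datetime(2022, 5, 20, 12, 0)))  # False
```

After the window closes, only the log files in OSS remain available.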