SparkPi is well known to Spark developers and is the equivalent of a "Hello World" program for Apache Spark. This topic describes how to submit a SparkPi job in the Data Lake Analytics (DLA) console.
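SparkPi estimates the value of pi with a Monte Carlo method: it samples random points in the unit square and counts the fraction that land inside the quarter circle. The following is a minimal, non-Spark Python sketch of the same idea (the function name and sample count are illustrative, not part of the SparkPi source):

```python
import random

def estimate_pi(num_samples: int, seed: int = 42) -> float:
    """Estimate pi by sampling points in the unit square and
    counting how many fall inside the quarter circle of radius 1."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(num_samples):
        x, y = rng.random(), rng.random()
        # Point is inside the quarter circle if x^2 + y^2 <= 1.
        if x * x + y * y <= 1.0:
            inside += 1
    # Ratio of areas: (quarter circle) / (unit square) = pi / 4.
    return 4.0 * inside / num_samples

print(estimate_pi(100_000))
```

The real SparkPi distributes this sampling loop across executors; the estimate converges toward pi as the sample count grows.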

Prerequisites

  1. A virtual cluster (VC) is created before you submit a SparkPi job. For more information about how to create a VC, see Create a virtual cluster.
    Note: When you create a VC, you must set Engine to Spark.
  2. Your RAM user is granted the permissions to submit SparkPi jobs. This operation is required only when you log on to DLA as a RAM user. For more information, see Grant permissions to a RAM user (simplified version).

Procedure

  1. Log on to the DLA console.
  2. In the top navigation bar of the Overview page, select the region where the VC resides.
  3. In the left-side navigation pane, choose Serverless Spark > Submit job.
  4. On the Parameter Configuration page, click Create Job. In the Create Job dialog box, configure the parameters shown in the following figure and click OK.
  5. On the Parameter Configuration page, retain the default configuration of the SparkPi job in the code editor. Then, click Execute.
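The default configuration in the code editor is a JSON document along the following lines. This is a sketch for orientation only: the JAR path and the resource values (`spark.driver.resourceSpec`, `spark.executor.instances`, `spark.executor.resourceSpec`) are illustrative assumptions, so rely on the defaults shown in your console.

```json
{
    "name": "SparkPi",
    "file": "local:///tmp/spark-examples.jar",
    "className": "org.apache.spark.examples.SparkPi",
    "conf": {
        "spark.driver.resourceSpec": "medium",
        "spark.executor.instances": 2,
        "spark.executor.resourceSpec": "medium"
    }
}
```

Here `className` identifies the SparkPi entry point inside the examples JAR, and `file` points to the JAR that contains it.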