EMR Serverless Spark supports spark-submit command-line parameters, letting you submit Spark batch jobs directly from the console without writing SDK code. This guide walks you through the full workflow: preparing a JAR package, uploading it to Object Storage Service (OSS), running a Spark Submit task, and publishing it for scheduling.
Prerequisites
Before you begin, ensure that you have:
A workspace. See Workspace Management for setup instructions.
A resource queue configured in your workspace. See Manage resource queues.
A JAR package ready to submit (Step 1 provides a sample JAR if you don't have one).
Step 1: Get a JAR package
This guide uses a sample JAR that calculates the value of Pi (π) to verify the end-to-end workflow.
Download the JAR that matches your database engine version:
| Database engine version | JAR package |
|---|---|
| esr-4.x | spark-examples_2.12-3.5.2.jar |
| esr-5.x | spark-examples_2.13-4.0.1.jar |
If you already have your own JAR package, skip this step and use that file in the steps below.
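The sample job estimates π with a Monte Carlo method: it scatters random points over the unit square and counts the fraction that land inside the quarter circle. A minimal pure-Python sketch of the same idea (an illustration only, not the actual SparkPi source, which distributes the sampling across executors):

```python
import random

def estimate_pi(samples: int, seed: int = 42) -> float:
    """Monte Carlo estimate of pi: 4 * (fraction of random points
    in the unit square that fall inside the quarter circle)."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / samples

print(estimate_pi(1_000_000))  # close to 3.14159
```

When the Spark job succeeds, its driver log contains a similar single-line estimate, which is what you will check in Step 3.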
Step 2: Upload the JAR package to OSS
Upload the JAR to an OSS bucket that your workspace can access. For upload instructions, see Simple upload.
After the upload completes, note the full OSS path (for example, oss://<your-bucket>/spark-examples_2.12-3.5.2.jar). You will use this path in the next step.
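A typo in the OSS path only surfaces when the job fails at runtime, so if you script your setup it can be worth sanity-checking the path before pasting it into the task. A small hypothetical helper (the function name and checks are illustrative, not part of any OSS SDK):

```python
def oss_path(bucket: str, key: str) -> str:
    """Build the oss://<bucket>/<key> path expected by the Spark
    Submit script, rejecting obviously malformed inputs."""
    if not bucket or "/" in bucket:
        raise ValueError(f"invalid bucket name: {bucket!r}")
    key = key.lstrip("/")
    if not key:
        raise ValueError("object key must not be empty")
    return f"oss://{bucket}/{key}"

print(oss_path("my-bucket", "jars/spark-examples_2.12-3.5.2.jar"))
# → oss://my-bucket/jars/spark-examples_2.12-3.5.2.jar
```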
Step 3: Create and run a Spark Submit task
On the EMR Serverless Spark page, click Data Development in the navigation pane on the left.
On the Development tab, click the create icon to create a new task.
Enter a name for the task. Set Type to Batch Job > Spark Submit, then click OK.
In the upper-right corner, select a resource queue.
In the task editor, enter your Spark Submit script in the Script field. Leave other parameters unchanged. The script uses standard spark-submit syntax:
```
--class <main-class>                   # Entry point class (required)
--conf <key>=<value>                   # Spark configuration override (optional, repeatable)
oss://<your-bucket>/<your-jar-file>    # Path to your JAR in OSS (required)
```
For the sample JAR downloaded in Step 1, use:
```
--class org.apache.spark.examples.SparkPi \
--conf spark.executor.memory=2g \
oss://<YourBucket>/spark-examples_2.12-3.5.2.jar
```
Replace <YourBucket> with your OSS bucket name. If you downloaded the esr-5.x JAR, replace the filename accordingly.
Click Run.
In the Execution Records section, wait for the task status to update.
Click Log Exploration in the Actions column to verify the task ran successfully. On the Log Exploration tab, confirm the task completed without errors.

Step 4: Publish the task
Published tasks can be added as nodes in workflows for scheduled execution.
After the task runs successfully, click Publish on the right.
In the dialog box, enter release notes and click OK.
(Optional) Step 5: View the Spark UI
View detailed execution metrics in the Spark UI after the task completes.
In the navigation pane on the left, click Job History.
On the Application page, find your task and click Spark UI in the Actions column. The Spark UI opens and displays execution details for the task.
What's next
Schedule the published task as part of a workflow. See Manage workflows.
Explore a complete task orchestration example. See Get started with SparkSQL development.