Serverless Spark provides various built-in parameters. This topic describes these parameters and their use cases to help you configure the runtime environment and optimize task execution.
| Parameter | Description | Scenario |
| --- | --- | --- |
| spark.emr.serverless.user.defined.jars | Adds uploaded JAR packages to the classpath of the Serverless Spark driver and executors. | Use this parameter to add custom JAR packages from OSS to the Spark driver and executors when you submit Spark tasks by using the Spark-Submit tool, batch jobs, or the Airflow Serverless Spark Operator, or when you create session resources. |
| spark.emr.serverless.fusion | Specifies whether to enable the Fusion engine for sessions or batch jobs started by Kyuubi and Livy. Valid values: true and false. | Use the Spark Configuration parameter of a task or session to enable Fusion. |
| spark.emr.serverless.environmentId | Specifies the ID of the runtime environment to use for computing resources. | Use this parameter to specify a runtime environment when you submit Serverless Spark tasks by using Airflow or the Spark-Submit tool. Third-party dependency libraries are installed in the runtime environment by default. |
| spark.emr.serverless.network.service.name | Specifies the name of the network connection that enables connectivity between computing resources and data sources in other virtual private clouds (VPCs). | Use this parameter to add a network connection when you submit a Serverless Spark task so that the task can access data sources in other VPCs. |
| spark.emr.serverless.excludedModules | Removes built-in libraries from Serverless Spark. | This parameter is typically used when you need to use custom JAR packages. It lets you remove built-in Serverless Spark libraries when you submit Spark tasks from the Serverless Spark console, the Spark-Submit tool, batch jobs, the Airflow Serverless Spark Operator, Kyuubi, or Livy, or when you create session resources. |
| spark.emr.serverless.kyuubi.engine.queue | Specifies the name of the workspace queue in which the Spark application started by Kyuubi runs. | Set this parameter in the Kyuubi configuration section or specify it in the JDBC URL when you establish a connection. |
| spark.emr.serverless.templateId | Specifies the ID of the default configuration template for the Spark application. By referencing a predefined workspace template, you can simplify parameter configuration when you submit a task. You can obtain the template ID in the console. | This parameter is supported only by the Spark-Submit tool. |
| spark.emr.serverless.livy.config.mode | Controls whether the settings configured for the Livy Gateway take effect for submitted jobs. | Set this parameter when you submit batch jobs or sessions through Livy. |
| spark.emr.serverless.tag.xxxx | Adds tags to a batch job submitted through Livy. Replace xxxx with the tag key. | Use this parameter to add tags to Spark jobs submitted through the Livy Gateway. You can then filter jobs by these tags in the job history. |
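To illustrate how several of these parameters fit together, the following sketch shows a Spark-Submit invocation that adds a custom JAR, selects a runtime environment, and references a configuration template. All OSS paths and IDs are placeholders, not real values; substitute the ones from your own workspace.

```shell
# Hypothetical spark-submit invocation; the bucket paths, environment ID,
# and template ID below are placeholders and must be replaced with values
# from your own workspace.
spark-submit \
  --conf spark.emr.serverless.user.defined.jars=oss://my-bucket/libs/my-udf.jar \
  --conf spark.emr.serverless.environmentId=<your-environment-id> \
  --conf spark.emr.serverless.templateId=<your-template-id> \
  oss://my-bucket/jobs/main.py
```

Because spark.emr.serverless.templateId is supported only by the Spark-Submit tool, the same template reference cannot be reused as-is in Airflow or Livy submissions.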
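For spark.emr.serverless.kyuubi.engine.queue, one option noted above is to pass the parameter in the JDBC URL when you establish a connection. The sketch below assumes a Beeline client; the endpoint, port, and queue name are placeholders, and the general Kyuubi convention of appending Spark configurations after the "#" in the URL is assumed to apply.

```shell
# Hypothetical Kyuubi connection; <endpoint> and dev_queue are placeholders.
# Spark configurations follow the "#" fragment of a Kyuubi JDBC URL.
beeline -u "jdbc:hive2://<endpoint>:10009/default;#spark.emr.serverless.kyuubi.engine.queue=dev_queue"
```

Setting the queue in the URL keeps the choice per-connection, whereas setting it in the Kyuubi configuration section applies it to every session started through that gateway.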