
AnalyticDB:StartSparkSQLEngine

Last Updated:Aug 30, 2024

Starts the Spark SQL engine.

Operation description

  • Regional public endpoint: adb.<region-id>.aliyuncs.com. Example: adb.cn-hangzhou.aliyuncs.com.
  • Regional Virtual Private Cloud (VPC) endpoint: adb-vpc.<region-id>.aliyuncs.com. Example: adb-vpc.cn-hangzhou.aliyuncs.com.
Note If HTTP status code 409 is returned when you call this operation in the China (Qingdao), China (Shenzhen), China (Guangzhou), or China (Hong Kong) region, contact technical support.
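The endpoint pattern above can be sketched as a small helper; `adb_endpoint` is a hypothetical name for illustration, not part of any SDK:

```python
def adb_endpoint(region_id: str, vpc: bool = False) -> str:
    """Build the AnalyticDB endpoint for a region.

    Set vpc=True to get the VPC endpoint instead of the public one.
    """
    prefix = "adb-vpc" if vpc else "adb"
    return f"{prefix}.{region_id}.aliyuncs.com"

print(adb_endpoint("cn-hangzhou"))            # adb.cn-hangzhou.aliyuncs.com
print(adb_endpoint("cn-hangzhou", vpc=True))  # adb-vpc.cn-hangzhou.aliyuncs.com
```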

Debugging

OpenAPI Explorer automatically calculates the signature value. For your convenience, we recommend that you call this operation in OpenAPI Explorer.

Authorization information

The following table shows the authorization information corresponding to the API. The authorization information can be used in the Action policy element to grant a RAM user or RAM role the permissions to call this API operation. The following list describes the columns:

  • Operation: the value that you can use in the Action element to specify the operation on a resource.
  • Access level: the access level of each operation. The levels are read, write, and list.
  • Resource type: the type of the resource on which you can authorize the RAM user or the RAM role to perform the operation. Take note of the following items:
    • The required resource types are displayed in bold characters.
    • If the permissions cannot be granted at the resource level, All Resources is used in the Resource type column of the operation.
  • Condition key: the condition key that is defined by the cloud service.
  • Associated operation: other operations that the RAM user or the RAM role must be authorized to perform before this operation can be called.
| Operation | Access level | Resource type | Condition key | Associated operation |
| --- | --- | --- | --- | --- |
| adb:StartSparkEngine | create | All Resources `*` | none | none |
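As a concrete illustration, a minimal RAM policy statement that grants this operation might look like the following. This is a sketch only; in practice, scope the Resource element to your own environment instead of `*` where the service supports it:

```json
{
  "Version": "1",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "adb:StartSparkEngine",
      "Resource": "*"
    }
  ]
}
```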

Request parameters

| Parameter | Type | Required | Description | Example |
| --- | --- | --- | --- | --- |
| DBClusterId | string | Yes | The cluster ID. | amv-abcd**** |
| ResourceGroupName | string | Yes | The name of the resource group. | spark-rg-name |
| MinExecutor | long | No | The minimum number of executors that are required to execute SQL statements. Valid values: 0 to 2000. A value of 0 indicates that no executors remain resident when no SQL statements are being executed. If this value exceeds the total number of executors that are supported by the resource group, the Spark SQL engine fails to start. The value must be less than the value of MaxExecutor. | 1 |
| MaxExecutor | long | No | The maximum number of executors that are required to execute SQL statements. Valid values: 1 to 2000. If this value exceeds the total number of executors that are supported by the resource group, the Spark SQL engine fails to start. | 10 |
| Jars | string | No | The Object Storage Service (OSS) paths of third-party JAR packages that are required to start the Spark SQL engine. Separate multiple OSS paths with commas (,). | oss://testBuckname/test.jar,oss://testBuckname/test2.jar |
| SlotNum | long | No | The maximum number of slots that are required to maintain Spark sessions for executing SQL statements. Valid values: 1 to 500. | 100 |
| Config | string | No | The configuration that is required to start the Spark SQL engine, in the JSON format. For more information, see Conf configuration parameters. | { "spark.shuffle.timeout": "0s" } |
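The constraints in the table above (value ranges, the MinExecutor < MaxExecutor relationship, comma-separated Jars, and Config as a JSON string) can be checked on the client side before the call. The sketch below is illustrative only; `build_start_engine_params` is a hypothetical helper, not part of any official SDK:

```python
import json

def build_start_engine_params(db_cluster_id, resource_group_name,
                              min_executor=None, max_executor=None,
                              jars=None, slot_num=None, config=None):
    """Assemble and validate StartSparkSQLEngine request parameters."""
    # Enforce the documented value ranges.
    if min_executor is not None and not (0 <= min_executor <= 2000):
        raise ValueError("MinExecutor must be in [0, 2000]")
    if max_executor is not None and not (1 <= max_executor <= 2000):
        raise ValueError("MaxExecutor must be in [1, 2000]")
    if (min_executor is not None and max_executor is not None
            and min_executor >= max_executor):
        raise ValueError("MinExecutor must be less than MaxExecutor")
    if slot_num is not None and not (1 <= slot_num <= 500):
        raise ValueError("SlotNum must be in [1, 500]")

    params = {
        "DBClusterId": db_cluster_id,
        "ResourceGroupName": resource_group_name,
    }
    if min_executor is not None:
        params["MinExecutor"] = min_executor
    if max_executor is not None:
        params["MaxExecutor"] = max_executor
    if jars:
        params["Jars"] = ",".join(jars)    # comma-separated OSS paths
    if slot_num is not None:
        params["SlotNum"] = slot_num
    if config:
        params["Config"] = json.dumps(config)  # JSON string, not an object
    return params
```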

Response parameters

| Parameter | Type | Description | Example |
| --- | --- | --- | --- |
| (root) | object | The response parameters. | |
| RequestId | string | The request ID. | D65A809F-34CE-4550-9BC1-0ED21ETG380 |
| Data | object | The returned data. | |
| Data.AppId | string | The ID of the Spark job. | s202301xxxx |
| Data.State | string | The state of the Spark SQL engine. Valid values: SUBMITTED, STARTING, RUNNING, FAILED. | SUBMITTED |
| Data.AppName | string | The name of the Spark application. | SQLEngine1 |

Examples

Sample success responses

JSON format

{
  "RequestId": "D65A809F-34CE-4550-9BC1-0ED21ETG380",
  "Data": {
    "AppId": "s202301xxxx",
    "State": "SUBMITTED",
    "AppName": "SQLEngine1"
  }
}
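Because a success response may still report a transient State such as SUBMITTED or STARTING, a caller typically polls until the engine reaches a terminal state. A minimal polling sketch, assuming a caller-supplied `describe_state` function that re-queries the current State (a placeholder, not part of this API):

```python
import time

def wait_for_engine(describe_state, timeout_s=300, interval_s=5):
    """Poll until the Spark SQL engine reaches RUNNING or FAILED.

    describe_state: caller-supplied callable returning the current
    State string (e.g. by re-querying the application's status).
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        state = describe_state()
        if state == "RUNNING":
            return state
        if state == "FAILED":
            raise RuntimeError("Spark SQL engine failed to start")
        time.sleep(interval_s)  # SUBMITTED / STARTING: keep waiting
    raise TimeoutError("engine did not reach RUNNING in time")
```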

Error codes

| HTTP status code | Error code | Error message | Description |
| --- | --- | --- | --- |
| 400 | Spark.InvalidParameter | Invalid parameter value: %s | The specified parameter is invalid. |
| 400 | Spark.App.ExceedQuotaLimitation | The requested resource exceeds the maximum limit: %s | - |
| 400 | Spark.App.InvalidAppTypeWhenSubmit | The specified AppType is INVALID or NULL. Please refer to the documentation to correct the parameters for %s. | The specified AppType is invalid. See the documentation to specify an AppType that is suitable for your business. |
| 400 | Spark.App.InvalidResourceSpec | The requested resource type is not supported:\n %s | - |
| 400 | Spark.App.ParameterConflict | Conflicting parameters submitted:\n %s | - |
| 400 | Spark.App.ResourceNegativeError | The requested resource MUST be positive: %s | - |
| 400 | Spark.Config.InvalidAppFilePath | The main driver file MUST be a valid OSS path. Current is %s. | - |
| 400 | Spark.Config.invalidConnectors | The spark.adb.connectors configuration is invalid: %s | - |
| 400 | Spark.Config.InvalidDiskConfig | The requested disk mount parameter is invalid: %s | - |
| 400 | Spark.Config.InvalidHostAliasConfig | The requested host alias parameter %s is invalid. Example: 127.0.0.1 localhost local | - |
| 400 | Spark.Config.InvalidLogOssPath | The OSS address for log storage is illegal: %s | - |
| 400 | Spark.Config.InvalidRoleArnFormat | Configure RoleARN %s invalid format. It should match acs:ram::uid_number:role/role_name | - |
| 400 | Spark.Config.InvalidTimeZone | Unable to parse time zone configuration %s : %s | - |
| 400 | Spark.Config.MainAppFileNotExists | The main driver file is missing, [file] or [sqls] must be configured. | - |
| 400 | Spark.Config.OSSPathAccessError | The configured OSS address %s cannot be accessed. | - |
| 400 | Spark.Config.OSSPathNotExists | The OSS address %s requested does not exist or the permissions are insufficient. | - |
| 400 | Spark.Config.RoleArnVerifyFailed | RoleARN parameter verification failed. Error msg: %s when verify RoleArn %s | - |
| 400 | Spark.Config.SecurityGroupNotFound | The security group in the configuration does not exist or cannot be accessed. %s. | The configured security group does not exist or cannot be accessed due to insufficient permissions. %s. |
| 400 | Spark.Config.VswitchNotFound | The vswitch in the configuration does not exist or cannot be accessed. %s. | - |
| 400 | Spark.InvalidState | The object of the operation is in an invalid state: %s | The operation object is in an invalid state. |
| 400 | Spark.Log.InvalidState | Failed to obtain the logs of the Spark job %s in the %s state. | - |
| 403 | Spark.Forbidden | No permissions to access the resources: %s | Insufficient permissions to access the related resources. Information that you want to access: %s. |
| 404 | Spark.App.ContentNotFound | The requested content %s of the Spark application is not found. | - |
| 404 | Spark.App.NotFound | The Spark application %s is not found. | - |
| 404 | Spark.ObjectNotFound | The object is not found. More information: %s | - |
| 500 | Spark.ServerError | The Spark control component system encountered an error, please create a ticket to solve the problem or concat the supported engineer on duty. Error message: %s | An error occurred on the Spark control component system. Submit a ticket or contact technical support. |

For a complete list of error codes, see Service error codes.

Change history

| Change time | Summary of changes | Operation |
| --- | --- | --- |
| 2023-07-27 | The error codes have changed | View Change Details |
| 2023-06-28 | The error codes have changed | View Change Details |