Obtain the required SDK

You can obtain the latest SDK package for DLA from the PyPI repository. For the download link, see the official website of the SDK for Python.
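For example, you can install the package with pip. The package names below (the core SDK plus the DLA/OpenAnalytics SDK) are assumptions based on Alibaba Cloud's usual naming; confirm the exact names on the official SDK page before installing.

```shell
# Install the core SDK and the DLA (OpenAnalytics) SDK from PyPI.
# Package names are assumptions; verify them on the official SDK page.
pip install aliyun-python-sdk-core aliyun-python-sdk-openanalytics-open
```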

Use the SDK to submit a Spark job

  1. Obtain the AccessKey pair. For more information, see Obtain an AccessKey pair.
  2. Obtain the ID of the region to which the current zone belongs. For more information about the region ID of each zone, see Regions and zones.
  3. Determine the name of the virtual cluster (VC) and the JSON configuration of the Spark job. You can verify both by first submitting the Spark job in the DLA console.
    Sample code for using the SDK to submit a Spark job:

    import json

    from aliyunsdkcore.client import AcsClient
    from aliyunsdkopenanalytics_open.request.v20180619 import SubmitSparkJobRequest


    def submit_spark_job(region: str, access_key_id: str,
                         access_key_secret: str, cluster_name: str, job_config: str) -> str:
        """
        Submit a Spark job and return the ID of the job.

        :param region:             The ID of the region where the Spark job is submitted.
        :param access_key_id:      The AccessKey ID.
        :param access_key_secret:  The AccessKey secret.
        :param cluster_name:       The name of the Spark VC where the job is executed.
        :param job_config:         The JSON string that describes the Spark job.
        :return:                   The ID of the Spark job.
        :rtype:                    str
        :exception:                ClientException
        """
        # Create a client.
        client = AcsClient(ak=access_key_id, secret=access_key_secret, region_id=region)
        # Initialize the request content.
        request = SubmitSparkJobRequest.SubmitSparkJobRequest()
        request.set_VcName(cluster_name)
        request.set_ConfigJson(job_config)
        # Submit the job and obtain the result.
        response = client.do_action_with_exception(request)
        # Return the job ID.
        r = json.loads(str(response, encoding='utf-8'))
        return r['JobId']
    Notice: JobConfig must be a valid JSON string. We recommend that you first test jobs by submitting them in the DLA console, and then use the SDK to automate the submission of your core services.
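A sketch of assembling such a JobConfig string before passing it to the function above. The field names (`file`, `className`, `conf`) follow the common shape of a Spark JAR job configuration, and the OSS path, class name, and resource values are illustrative placeholders, not real resources:

```python
import json

# Build a JobConfig for a Spark JAR job. All values below are
# illustrative placeholders; replace them with your own resources.
job_config = json.dumps({
    "name": "SparkPi",
    "file": "oss://your-bucket/spark-examples.jar",   # placeholder OSS path
    "className": "org.apache.spark.examples.SparkPi",
    "conf": {
        "spark.driver.resourceSpec": "medium",
        "spark.executor.instances": 2,
        "spark.executor.resourceSpec": "medium",
    },
})

# job_config is the JSON string passed as the job_config
# argument of submit_spark_job(...).
print(job_config)
```

Serializing with `json.dumps` rather than hand-writing the string guarantees the result is valid JSON, which is exactly what the notice above requires.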