By default, Spark SQL jobs are submitted in Yarn mode.
Log on to the Alibaba Cloud E-MapReduce console and go to the Job List page.
Click Create a job in the upper-right corner to open the job creation page.
Enter the job name.
Select the Spark SQL job type to create a Spark SQL job. Jobs of this type are submitted in the background with the following command:
spark-sql [options] [cli option]
In the Parameters box, enter the options that follow the spark-sql command.
To run SQL directly, use the -e option and write the statement in the Parameters box of the job, for example:
-e "show databases;"
Alternatively, the -f option specifies a Spark SQL script file. Storing prepared Spark SQL scripts in OSS gives you more flexibility, and we recommend this mode of operation, for example:
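A hypothetical invocation is sketched below; the bucket name and script path are placeholders, not values from this document:

-f ossref://your-bucket/path/to/your-script.sql

The ossref scheme lets the job download the script from OSS before running it, so you can update the script without changing the job configuration.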
Select a policy for handling failed operations.
Click OK to complete the Spark SQL job configuration.