Shell job configuration
Notice By default, Shell scripts are run as the hadoop user. If the root user is required, use sudo in the script. Use Shell script jobs with caution.
- Log on to the Alibaba Cloud E-MapReduce console with your primary account and go to the Cluster List page.
- Click the Data Platform tab at the top to go to the Project List page.
- Click Design Workflow of the specified project in the Operation column.
- On the left side of the Job Editing page, right-click the folder you want to work with and select New Job.
- In the New Job dialog box, enter the job name and description.
- Select the Shell job type to create a Bash Shell job. Once the job type is selected, it cannot be modified.
- Click OK.
Note You can also create subfolders, rename folders, and delete folders by right-clicking a folder.
- Enter the parameters in the Content box, placing them after the Shell command options.
- -c option
The -c option lets you enter the Shell script to run directly in the Content box of the job, for example:
-c "echo 1; sleep 2; echo 2; sleep 4; echo 3; sleep 8; echo 4; sleep 16; echo 5; sleep 32; echo 6; sleep 64; echo 7; sleep 128; echo finished"
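As a quick sanity check, you can run the same kind of command string locally with bash -c before saving the job. This is a local sketch only (with shorter sleeps than the job example so it finishes quickly), not part of the E-MapReduce configuration itself:

```shell
# Run a Content-style command string through bash -c, as the job would,
# and capture the output. Shorter sleeps so this completes quickly.
out=$(bash -c "echo 1; sleep 1; echo 2; sleep 1; echo finished")
echo "$out"
```

If the string prints what you expect locally, it should behave the same way in the job's Content box.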
- -f option
The -f option runs a Shell script file. After you upload a Shell script file to OSS, you can reference it directly in the job parameters, which is more flexible than the -c option, for example (replace the bucket and path with your own):
-f ossref://your-bucket/path/script.sh
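A minimal sketch of a script file you might upload for use with the -f option. The file name, bucket, and path here are placeholders for illustration, not values defined by the product:

```shell
#!/usr/bin/env bash
# job.sh - hypothetical example of a script to upload to OSS
# and reference with the -f option. Exit on any error.
set -euo pipefail

echo "job started"
for i in 1 2 3; do
    echo "step $i"
done
status="finished"
echo "$status"
```

After uploading the file to OSS (for example with the ossutil command-line tool: ossutil cp job.sh oss://your-bucket/scripts/job.sh), set the job content to -f followed by the script's OSS path.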
- Click Save to complete Shell job configuration.