In this tutorial, you will learn how to configure a Shell job.
Notice By default, Shell scripts are run as the hadoop user. If you need to run a script as the root user, you can use sudo. Use Shell script jobs with caution.
- Log on to the Alibaba Cloud E-MapReduce console.
- In the top navigation bar, click Data Platform.
- In the Actions column, click Design Workflow next to the specified project.
- On the left of the Job Editing page, right-click the folder in which you want to create the job and select New Job.
- In the New Job dialog box, enter the job name and description.
- Select the Shell job type to create a Bash Shell job.
- Click OK.
Note You can also create subfolders, rename folders, and delete folders by right-clicking on them.
- In the Content field, enter the options and parameters that follow the Shell command.
- -c option
The -c option lets you enter the Shell script to run directly in the Content field of the job. For example:
-c "echo 1; sleep 2; echo 2; sleep 4; echo 3; sleep 8; echo 4; sleep 16; echo 5; sleep 32; echo 6; sleep 64; echo 7; sleep 128; echo finished"
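To see what the -c option does, you can try the same pattern locally: the string after -c is handed to the shell exactly as if you had run it with bash -c. The sketch below uses shortened sleep intervals so it finishes quickly; it is illustrative only, not the job's actual runtime behavior on the cluster.

```shell
# Run an inline command string, just as the job's -c option does.
# The sleeps are shortened here for illustration.
bash -c "echo 1; sleep 1; echo 2; sleep 1; echo finished"
# prints:
# 1
# 2
# finished
```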
- -f option
The -f option runs a Shell script file. Upload the script file to OSS, then reference its OSS path in the job parameters. This is more flexible than the -c option because the script can be maintained independently of the job configuration. For example (the bucket and path below are placeholders):
-f ossref://your-bucket/path/script.sh
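A minimal sketch of the script-file workflow, assuming a hypothetical script named hello.sh: the file is written locally and executed directly; in a real job you would upload it to OSS and point the -f option at its OSS path (the bucket and path in the comment are placeholders, not real resources).

```shell
# Create a hypothetical script file (hello.sh) of the kind you would
# upload to OSS for use with the -f option.
cat > /tmp/hello.sh <<'EOF'
#!/usr/bin/env bash
echo "running from a script file"
EOF

# Run it locally to verify it works before uploading.
bash /tmp/hello.sh
# prints: running from a script file

# In the job's Content field you would then reference the OSS copy, e.g.:
#   -f ossref://your-bucket/path/hello.sh
```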
- Click Save to complete the Shell job configuration.