In this tutorial, you will learn how to configure a Shell job.

Notice By default, Shell scripts are run as the hadoop user. To run a script as the root user, use the sudo command. Use Shell script jobs with caution.

Procedure

  1. Log on to the Alibaba Cloud E-MapReduce console.
  2. At the top of the navigation bar, click Data Platform.
  3. In the Projects area, click the ID of the target project to go to the Project Management tab page.
  4. In the left-side navigation bar, click Edit Jobs next to the specified project.
  5. On the left of the Edit Jobs tab page, right-click the folder that you want to use and select New Job.
  6. In the New Job dialog box, enter the job name and description.
  7. Select the Shell job type to create a Bash Shell job.
  8. Click OK.
    Note You can also create subfolders, rename folders, and delete folders by right-clicking on them.
  9. In the Content field, enter the parameters that follow the Shell command.
    • -c option
      The -c option runs a Shell script that you enter directly in the Content field of the job. Enclose the script in quotation marks. For example:
      -c "echo 1; sleep 2; echo 2; sleep 4; echo 3; sleep 8; echo 4; sleep 16; echo 5; sleep 32; echo 6; sleep 64; echo 8; sleep 128; echo finished"
    • -f option
      The -f option runs a Shell script file. Upload the script file to OSS, and then reference its OSS path in the job parameters. This is more flexible than the -c option. For example:
      -f ossref://mxbucket/sample/sample-shell-job.sh
  10. Click Save to complete Shell job configurations.
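For the -f option, the referenced OSS object is an ordinary Shell script. Below is a minimal sketch of what such a script might contain; the file name sample-shell-job.sh comes from the example above, and the actual work (here, just listing the working directory) is a placeholder you would replace with your own commands:

```shell
#!/usr/bin/env bash
# Hypothetical contents of sample-shell-job.sh (the script referenced by
# the -f example above). Replace the placeholder work with your own logic.
set -euo pipefail            # exit on any error or unset variable

echo "shell job started"

# Placeholder work: list the job's current working directory.
ls -l .

STATUS="finished"
echo "$STATUS"
```

After uploading the script to your own OSS bucket, reference it in the Content field in the form -f ossref://<bucket>/<path>/sample-shell-job.sh.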