
Quick start for console

Last Updated: Nov 02, 2018

This section describes how to use the console to submit a job. The job aims to count the number of times INFO, WARN, ERROR, and DEBUG appear in a log file.

Note: Make sure that you have signed up for the Batch Compute service in advance.

Contents:

  1. Prepare a job.

    1.1. Upload the data file to OSS.
    1.2. Upload the task program to OSS.

  2. Use the console to submit the job.

  3. Check the job status.

  4. Check the result.

1. Prepare a job

The job aims to count the number of times INFO, WARN, ERROR, and DEBUG appear in a log file.

This job contains the following tasks:

  • The split task divides the log file into three parts.
  • The count task counts the number of times INFO, WARN, ERROR, and DEBUG appear in each part of the log file. For this task, InstanceCount is set to 3 so that three count instances run concurrently, one per part.
  • The merge task merges all the count results.
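The counting and merging steps can be sketched as follows. This is a minimal illustration of the idea only; the actual count.py and merge.py in the downloadable package may differ.

```python
from collections import Counter

LEVELS = ("INFO", "WARN", "ERROR", "DEBUG")

def count_levels(lines):
    """Count how many lines contain each log level (what one count task does)."""
    counts = Counter()
    for line in lines:
        for level in LEVELS:
            if level in line:
                counts[level] += 1
    return counts

def merge_counts(partial_counts):
    """Sum the per-part counters (what the merge task does)."""
    total = Counter()
    for part in partial_counts:
        total.update(part)
    return dict(total)

part1 = count_levels(["INFO start", "ERROR disk full"])
part2 = count_levels(["WARN slow", "INFO done"])
print(merge_counts([part1, part2]))
```

Each count instance processes one part of the split log, and merge simply sums the per-part dictionaries.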

DAG:

[Figure: job DAG (split → count → merge)]

1.1 Upload data file to OSS

Download the data file used in this example: log-count-data.txt

Upload the log-count-data.txt file to:

  oss://your-bucket/log-count/log-count-data.txt

  • your-bucket indicates a bucket that you created. In this example, the bucket is assumed to be in the cn-shenzhen region.
  • For more information about how to upload data to OSS, see Upload files to OSS and Common OSS tools.

1.2 Upload task program to OSS

The job program used in this example is written in Python. Download the program log-count.tar.gz.

In this example, you do not need to modify the sample code. You can directly upload log-count.tar.gz to OSS, for example:

oss://your-bucket/log-count/log-count.tar.gz

Use the same upload method described in the previous section.

  • Batch Compute supports only compressed packages with the extension tar.gz. Make sure that you package the files using the preceding method (gzip); otherwise, the package cannot be parsed.
  • If you need to modify the code, decompress the file, modify the code, and then repackage it as follows:

The command is as follows:

  > cd log-count                  # Switch to the directory.
  > tar -czf log-count.tar.gz *   # Pack all files in this directory into log-count.tar.gz.

You can run the following command to check the content of the compressed package:

  $ tar -tvf log-count.tar.gz

The following list is displayed:

  conf.py
  count.py
  merge.py
  split.py
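Because Batch Compute expects the program files at the top level of the archive (that is why the tar command is run from inside the directory), you can also build and verify the package from Python with the standard tarfile module. The snippet below is a sketch that creates placeholder files in a temporary directory; the file names are those from the listing above.

```python
import os
import tarfile
import tempfile

# Create placeholder program files in a scratch directory.
workdir = tempfile.mkdtemp()
for name in ("conf.py", "count.py", "merge.py", "split.py"):
    with open(os.path.join(workdir, name), "w") as f:
        f.write("# placeholder\n")

# Build log-count.tar.gz with every member at the archive root,
# mirroring `tar -czf log-count.tar.gz *` run inside the directory.
archive = os.path.join(workdir, "log-count.tar.gz")
with tarfile.open(archive, "w:gz") as tar:
    for name in sorted(os.listdir(workdir)):
        if name.endswith(".py"):
            # arcname=name keeps the member at the top level (no directory prefix)
            tar.add(os.path.join(workdir, name), arcname=name)

# Verify the contents, like `tar -tvf log-count.tar.gz`.
with tarfile.open(archive, "r:gz") as tar:
    members = tar.getnames()
print(members)  # ['conf.py', 'count.py', 'merge.py', 'split.py']
```

If any member name contains a leading directory (for example log-count/split.py), the package was created from outside the directory and cannot be parsed as described above.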

2. Use console to submit job

  1. Log on to the Batch Compute console.

  2. Choose Job List > Submit Job, and submit the job. Select an appropriate region, which must be the same as the region of the bucket.

    Here, AutoCluster is used to submit the job. For AutoCluster, you must configure at least the following two parameters:

    • Available image ID. You can use an image provided by the system or a custom image. For more information about how to create a custom image, see Use an image.

    • InstanceType. For more information about the instance type, see Currently supported instance types.

    To run this example, you must also change PackagePath (the OSS path to which the job package was uploaded; oss://your-bucket/log-count/log-count.tar.gz in this example), as well as StdoutRedirectPath and StderrRedirectPath (the OSS directories to which task output and errors are written; oss://your-bucket/log-count/logs/ in this example), to the corresponding OSS directories.

    The following shows the JSON template of the job. For more information, see the job parameter documentation.

    {
      "DAG": {
        "Dependencies": {
          "split": ["count"],
          "count": ["merge"],
          "merge": []
        },
        "Tasks": {
          "split": {
            "InstanceCount": 1,
            "LogMapping": {},
            "AutoCluster": {
              "Configs": {
                "Networks": {
                  "VPC": {
                    "CidrBlock": "192.168.0.0/16"
                  }
                }
              },
              "ResourceType": "OnDemand",
              "InstanceType": "ecs.sn1ne.large",
              "ImageId": "img-ubuntu-vpc"
            },
            "Parameters": {
              "Command": {
                "EnvVars": {},
                "CommandLine": "python split.py",
                "PackagePath": "oss://your-bucket/log-count/log-count.tar.gz"
              },
              "InputMappingConfig": {
                "Lock": true
              },
              "StdoutRedirectPath": "oss://your-bucket/log-count/logs/",
              "StderrRedirectPath": "oss://your-bucket/log-count/logs/"
            },
            "InputMapping": {
              "oss://your-bucket/log-count/": "/home/input/"
            },
            "OutputMapping": {
              "/home/output/": "oss://your-bucket/log-count/"
            },
            "MaxRetryCount": 0,
            "Timeout": 21600,
            "ClusterId": ""
          },
          "merge": {
            "InstanceCount": 1,
            "LogMapping": {},
            "AutoCluster": {
              "Configs": {
                "Networks": {
                  "VPC": {
                    "CidrBlock": "192.168.0.0/16"
                  }
                }
              },
              "ResourceType": "OnDemand",
              "InstanceType": "ecs.sn1ne.large",
              "ImageId": "img-ubuntu-vpc"
            },
            "Parameters": {
              "Command": {
                "EnvVars": {},
                "CommandLine": "python merge.py",
                "PackagePath": "oss://your-bucket/log-count/log-count.tar.gz"
              },
              "InputMappingConfig": {
                "Lock": true
              },
              "StdoutRedirectPath": "oss://your-bucket/log-count/logs/",
              "StderrRedirectPath": "oss://your-bucket/log-count/logs/"
            },
            "InputMapping": {
              "oss://your-bucket/log-count/": "/home/input/"
            },
            "OutputMapping": {
              "/home/output/": "oss://your-bucket/log-count/"
            },
            "MaxRetryCount": 0,
            "Timeout": 21600,
            "ClusterId": ""
          },
          "count": {
            "InstanceCount": 3,
            "LogMapping": {},
            "AutoCluster": {
              "Configs": {
                "Networks": {
                  "VPC": {
                    "CidrBlock": "192.168.0.0/16"
                  }
                }
              },
              "ResourceType": "OnDemand",
              "InstanceType": "ecs.sn1ne.large",
              "ImageId": "img-ubuntu-vpc"
            },
            "Parameters": {
              "Command": {
                "EnvVars": {},
                "CommandLine": "python count.py",
                "PackagePath": "oss://your-bucket/log-count/log-count.tar.gz"
              },
              "InputMappingConfig": {
                "Lock": true
              },
              "StdoutRedirectPath": "oss://your-bucket/log-count/logs/",
              "StderrRedirectPath": "oss://your-bucket/log-count/logs/"
            },
            "InputMapping": {
              "oss://your-bucket/log-count/": "/home/input/"
            },
            "OutputMapping": {
              "/home/output/": "oss://your-bucket/log-count/"
            },
            "MaxRetryCount": 0,
            "Timeout": 21600,
            "ClusterId": ""
          }
        }
      },
      "Description": "batchcompute job",
      "Priority": 0,
      "JobFailOnInstanceFail": true,
      "Type": "DAG",
      "Name": "log-count"
    }
    • Check that all parameters and directories are correct, click Submit Job in the lower-left corner, and then click OK.
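In the template above, the Dependencies map lists, for each task, the tasks that run after it (split → count → merge). Before submitting, you can sanity-check such a map for undeclared tasks or cycles. The helper below is a hypothetical local check, not part of the Batch Compute API:

```python
from collections import deque

def execution_order(dependencies):
    """Topologically sort a Dependencies map of the form
    {task: [tasks that run after it]}. Raises ValueError on bad input."""
    indegree = {task: 0 for task in dependencies}
    for successors in dependencies.values():
        for succ in successors:
            if succ not in indegree:
                raise ValueError("undeclared task: " + succ)
            indegree[succ] += 1

    # Kahn's algorithm: repeatedly emit tasks with no remaining predecessors.
    ready = deque(t for t, d in indegree.items() if d == 0)
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        for succ in dependencies[task]:
            indegree[succ] -= 1
            if indegree[succ] == 0:
                ready.append(succ)
    if len(order) != len(dependencies):
        raise ValueError("cycle detected in Dependencies")
    return order

deps = {"split": ["count"], "count": ["merge"], "merge": []}
print(execution_order(deps))  # ['split', 'count', 'merge']
```

For this example the check confirms that the three tasks form a simple chain, matching the DAG shown in section 1.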

3. Check job status

  • Click the newly submitted job log-count in the job list to view the details of this job.


  • Click the task name split to view the details of this task.


  • Click the green block to view the instance log.


4. Check job execution result

You can log on to the OSS console and check the following file under your bucket: /log-count/merge_result.json.

The expected result is as follows:

  {"INFO": 2460, "WARN": 2448, "DEBUG": 2509, "ERROR": 2583}
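Once you have downloaded merge_result.json from the bucket, a few lines of Python suffice to load and inspect it. The JSON string below simply mirrors the expected result shown above; in practice you would read the downloaded file instead.

```python
import json

# In practice: result = json.load(open("merge_result.json"))
# Here the expected result is inlined for illustration.
result_text = '{"INFO": 2460, "WARN": 2448, "DEBUG": 2509, "ERROR": 2583}'
result = json.loads(result_text)

for level in ("INFO", "WARN", "ERROR", "DEBUG"):
    print(level, result[level])

print("sum of all counts:", sum(result.values()))
```

Comparing the loaded dictionary against the expected result is a quick way to confirm that the merge task completed correctly.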