
Quick start for CLI 2

Last Updated: May 10, 2018

This section describes how to use the Batch Compute-cli tool to submit a job that counts the occurrences of “INFO”, “WARN”, “ERROR”, and “DEBUG” in a log file.

Note: Make sure that you have signed up for the Batch Compute service in advance.

Contents:

1. Install and configure Batch Compute-cli tool

Click here for the installation and configuration of the Batch Compute-cli tool.

2. Prepare a job

The job aims to count the occurrences of “INFO”, “WARN”, “ERROR”, and “DEBUG” in a log file.

This job contains the following tasks:

  • The split task is used to divide the log file into three parts.
  • The count task is used to count the number of times “INFO”, “WARN”, “ERROR”, and “DEBUG” appear in each part of the log file. In the count task, InstanceCount must be set to 3, indicating that three count tasks are started concurrently.
  • The merge task merges all the results of the count task.
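
For intuition, each count task tallies the four log levels in its chunk of the file. The sketch below is a hypothetical illustration of that step; the function name count_levels is ours, not from the actual count.py:

```python
# Hypothetical sketch of a "count" step: tally the four log levels
# in one chunk of the log file. This is NOT the actual count.py.
from collections import Counter

LEVELS = ("INFO", "WARN", "ERROR", "DEBUG")

def count_levels(lines):
    """Return a dict mapping each log level to its occurrence count."""
    counts = Counter()
    for line in lines:
        for level in LEVELS:
            counts[level] += line.count(level)
    return dict(counts)

sample = ["10:01 INFO start", "10:02 ERROR oops", "10:03 INFO done"]
print(count_levels(sample))
```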

(Figure: job DAG showing split -> count -> merge)

2.1. Upload the data file to OSS

Download the data file used in this example: log-count-data.txt

Upload the log-count-data.txt file to:

    oss://your-bucket/log-count/log-count-data.txt

  • your-bucket indicates the bucket you created. In this example, the region is cn-shenzhen.

Run the following commands:

    bcs oss upload ./log-count-data.txt oss://your-bucket/log-count/log-count-data.txt
    bcs oss cat oss://your-bucket/log-count/log-count-data.txt # Check whether the file is uploaded successfully

  • The bcs oss command covers typical actions on your OSS bucket, and bcs oss -h shows the help information for this command. We recommend this command only when a small amount of test data is involved; for a large amount of data, the upload or download takes a long time because multithreading is not implemented yet. For more information about how to upload data to OSS, see OSS tools.

2.2. Prepare task programs

The job program used in this example is written in Python. Download the program: log-count.tar.gz.

Decompress the program package into the following directory:

    mkdir log-count && tar -xvf log-count.tar.gz -C log-count

After decompression, the log-count/ directory structure is as follows:

    log-count
    |-- conf.py   # Configuration
    |-- split.py  # split task program
    |-- count.py  # count task program
    |-- merge.py  # merge task program

Note: Do not change the task programs.
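
For intuition only (use the shipped split.py as-is), a split task conceptually cuts the input log into three roughly equal chunks, one per count task. The sketch below is a hypothetical illustration; the function name split_lines is ours, not from the package:

```python
# Hypothetical sketch of a "split" step: divide the log lines into
# three contiguous chunks for the parallel count tasks.
# This is NOT the actual split.py from log-count.tar.gz.
def split_lines(lines, parts=3):
    """Divide a list of lines into `parts` contiguous chunks."""
    size = (len(lines) + parts - 1) // parts  # ceiling division
    return [lines[i:i + size] for i in range(0, len(lines), size)]

chunks = split_lines([f"line {i}" for i in range(10)], parts=3)
print([len(c) for c in chunks])  # [4, 4, 2]
```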

3. Submit job

3.1. Compile job configuration

In the parent directory of log-count (not inside log-count itself), create a file named job.cfg with the following content:

    [DEFAULT]
    job_name=log-count
    description=demo
    pack=./log-count/
    deps=split->count;count->merge
    [split]
    cmd=python split.py
    [count]
    cmd=python count.py
    nodes=3
    [merge]
    cmd=python merge.py

The file describes a multi-task job, with tasks executed in the following sequence: split->count->merge.
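
The deps line encodes the DAG edges as src->dst pairs separated by semicolons. As a small illustration of that format (this parser is hypothetical, not part of the Batch Compute CLI):

```python
# Hypothetical parser for a deps string such as "split->count;count->merge".
# It is only meant to illustrate how the string maps to DAG edges.
def parse_deps(deps):
    """Return a dict mapping each task to the tasks that depend on it."""
    edges = {}
    for part in deps.split(";"):
        src, dst = part.split("->")
        edges.setdefault(src.strip(), []).append(dst.strip())
    return edges

print(parse_deps("split->count;count->merge"))
# {'split': ['count'], 'count': ['merge']}
```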

3.2. Submit the job

    bcs sub --file job.cfg -r oss://your-bucket/log-count/:/home/input -w oss://your-bucket/log-count/:/home/output

  • In the command, -r attaches an OSS path as a read-only directory and -w maps an OSS path to a writable directory. For more information, see OSS directory attaching.
  • The same OSS path can be attached to different local directories, but different OSS paths cannot be attached to the same local directory.

4. Check job running status

    bcs j     # Obtain the job list. The list is cached each time it is fetched; the first job in the cache is usually the one you just submitted.
    bcs ch 1  # Check the status of the first job in the cache.
    bcs log 1 # Check the log of the first job in the cache.

5. Check job execution result

After the job is executed, run the following command to check the result stored in OSS:

    bcs oss cat oss://your-bucket/log-count/merge_result.json

The expected result is as follows:

    {"INFO": 2460, "WARN": 2448, "DEBUG": 2509, "ERROR": 2583}
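
Conceptually, the merge task only has to sum the per-chunk dictionaries produced by the three count tasks. A hypothetical sketch of that step (not the actual merge.py, and the partial counts shown are illustrative):

```python
# Hypothetical sketch of a "merge" step: sum the partial counts from
# the count tasks into one total. This is NOT the actual merge.py.
import json
from collections import Counter

def merge_counts(partials):
    """Sum a list of per-chunk count dicts into one total dict."""
    total = Counter()
    for partial in partials:
        total.update(partial)  # adds values key by key
    return dict(total)

partials = [{"INFO": 820, "ERROR": 861}, {"INFO": 1640, "ERROR": 1722}]
print(json.dumps(merge_counts(partials)))  # {"INFO": 2460, "ERROR": 2583}
```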