
AnalyticDB: Use Azkaban to schedule XIHE SQL jobs

Last Updated: Mar 28, 2026

Azkaban is an open-source batch workflow job scheduler that can be used to create, execute, and manage workflows that contain complex dependencies. Use it to run XIHE SQL jobs against AnalyticDB for MySQL on a recurring schedule, with dependency tracking between jobs.

Prerequisites

Before you begin, ensure that the following requirements are met:

- An AnalyticDB for MySQL cluster is created, and you have a database account for the cluster.
- Azkaban is installed, and you can log in to the Azkaban web console.
- The MySQL command-line client is installed on the host where the Azkaban executor runs, and that host has network access to the cluster (for example, its IP address is added to the cluster whitelist).

Prepare test data

In your AnalyticDB for MySQL cluster, create a test database and table:

-- Create a database.
CREATE DATABASE azkaban_test;

-- Create a table.
CREATE TABLE azkaban_test.names (
  id BIGINT,
  name VARCHAR(64)
);

-- Insert data into the table.
INSERT INTO azkaban_test.names VALUES(1, 'Li'), (2, 'Yang');
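
To confirm that the test data is in place, you can query the table; the expected rows are the two inserted above:

```sql
-- Returns the two rows inserted above: (1, 'Li') and (2, 'Yang').
SELECT id, name FROM azkaban_test.names ORDER BY id;
```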

Step 1: Write a workflow file

An Azkaban workflow consists of .job files, each defining a single job. Each job declares its type, the shell command to run, and which jobs it depends on. You package all .job files into a ZIP file for upload.

This example defines three jobs that run in sequence: start → exec_query → end.

Job files

start.job — entry point of the workflow:

#start
type=command
command=echo 'job start'

exec_query.job — runs the SQL statement against AnalyticDB for MySQL after start completes:

type=command
dependencies=start
command=mysql -h<endpoint> -u<username> -p<password> -P3306 -e "create table azkaban_test.duplicated_names as select * from azkaban_test.names"

end.job — signals workflow completion after exec_query completes:

#end
type=command
dependencies=exec_query
command=echo 'job end'

The key fields in each .job file are:

type=command: runs the value of command as a shell command.
dependencies: a comma-separated list of job names that must complete before this job starts.
command: the shell command to run.

The MySQL command in exec_query.job uses the following parameters:

-h: the endpoint of the AnalyticDB for MySQL cluster. You can find it on the Cluster Information page in the AnalyticDB for MySQL console.
-u: the database account name for the cluster.
-p: the password of the database account. Do not put a space between -p and the password.
-P: the port number. Set the value to 3306.
-e: the SQL statement to run.

Package the workflow

Create a folder, place all .job files in it, and compress the folder as a ZIP file.
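
The packaging step can also be scripted. The sketch below writes the three .job files from this example into a folder and compresses the folder; the folder name azkaban_flow and archive name flow.zip are arbitrary example names, not Azkaban requirements:

```shell
# Create a working folder for the workflow (the name is an arbitrary example).
mkdir -p azkaban_flow

# start.job: entry point of the workflow.
cat > azkaban_flow/start.job <<'EOF'
#start
type=command
command=echo 'job start'
EOF

# exec_query.job: runs the SQL statement after start completes.
# <endpoint>, <username>, and <password> are placeholders to fill in.
cat > azkaban_flow/exec_query.job <<'EOF'
type=command
dependencies=start
command=mysql -h<endpoint> -u<username> -p<password> -P3306 -e "create table azkaban_test.duplicated_names as select * from azkaban_test.names"
EOF

# end.job: signals workflow completion after exec_query completes.
cat > azkaban_flow/end.job <<'EOF'
#end
type=command
dependencies=exec_query
command=echo 'job end'
EOF

# Compress the folder into the ZIP file that you upload in Step 2.
zip -qr flow.zip azkaban_flow
```

You can then upload flow.zip as described in the next step.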

Step 2: Create a project and upload the workflow file

  1. Log in to the Azkaban console. In the top navigation bar, click Projects.

  2. In the upper-right corner, click Create Project.

  3. In the Create Project dialog box, enter a Name and Description, then click Create Project.

  4. In the upper-right corner, click Upload.

  5. In the Upload Project Files dialog box, select the ZIP file you created, then click Upload.

Step 3: Execute the workflow

  1. On the Projects page, click the Flows tab.

  2. Click Execute Flow.

  3. In the dialog box, click Execute.

  4. In the Flow submitted notification, click Continue.

Step 4: View execution details

  1. In the top navigation bar, click Executing.

  2. Click the Recently Finished tab to see completed executions.

  3. Click an execution ID to open the execution detail page.

  4. On the Job List tab, review the status and duration of each job.

  5. Click Logs next to a job to view its full execution log.
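
As a final check, you can verify in the AnalyticDB for MySQL cluster that exec_query.job produced the expected table. For example, a query such as the following should return the same rows as azkaban_test.names:

```sql
-- duplicated_names is created by the CTAS statement in exec_query.job.
SELECT * FROM azkaban_test.duplicated_names;
```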