Dify workflows do not natively support time-based scheduling. XXL-JOB on Microservices Engine (MSE) fills this gap -- schedule Dify workflows on a recurring basis, monitor execution status, and get alerts on failures or timeouts.
Use cases
Scheduled Dify workflows fit any scenario where a large language model (LLM) task needs to run on a recurring basis:
Risk monitoring: Scan risk data every minute, analyze potential threats with an LLM, and trigger alerts in real time.
Data analysis: Pull financial data daily, generate LLM-driven insights, and deliver them to stakeholders.
Content generation: Summarize the day's work and produce daily reports automatically.
Scheduling options
XXL-JOB can reach both self-managed Dify services over the Internet and Dify services deployed on Alibaba Cloud over a Virtual Private Cloud (VPC).
Choose the schedule type that best fits your use case:
| Schedule type | Best for | Example |
|---|---|---|
| cron | Calendar-based recurring tasks | 0 0 8 * * ? (every day at 08:00) |
| fixed_rate | Consistent intervals, regardless of how long each run takes | Every 5 minutes |
| fixed_delay | Intervals measured from the end of the previous run | 10 seconds after each run completes |
| one_time | A single future execution | Run once at 2026-03-15 14:00 |
| api | On-demand triggering from an external system | Triggered by a CI/CD pipeline or monitoring system |
Custom time zones and custom calendars are also supported.
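The difference between fixed_rate and fixed_delay is easiest to see in code. The following is a minimal Python sketch of the two semantics (an illustration only, not XXL-JOB code); run_job and the 2-second sleep are placeholders for the actual work:

```python
import time

def run_job():
    # Placeholder for the scheduled work; sleeps to simulate a slow run.
    time.sleep(2)

def fixed_rate(interval_s: float, runs: int) -> None:
    # The next start time is measured from the *start* of the previous run,
    # so a slow run does not shift the overall schedule.
    next_start = time.monotonic()
    for _ in range(runs):
        time.sleep(max(0.0, next_start - time.monotonic()))
        run_job()
        next_start += interval_s

def fixed_delay(interval_s: float, runs: int) -> None:
    # Waits interval_s after the *end* of each run, so the effective
    # period is (run duration + interval_s).
    for _ in range(runs):
        run_job()
        time.sleep(interval_s)

if __name__ == "__main__":
    fixed_rate(5.0, 3)   # runs start roughly every 5 seconds
    fixed_delay(5.0, 3)  # runs start roughly every 7 seconds (2 s work + 5 s delay)
```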
Alerting and monitoring
Configure alert policies at the job, instance, or application level:
Failure alerts -- triggered when a job execution fails.
Timeout alerts -- triggered when execution exceeds the expected duration.
Success notifications -- sent after successful completion.
Threshold-triggered alerts -- based on custom metrics at the instance or application level.
Notifications can be delivered by text message, phone call, webhook, or email.
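If you route alerts to a webhook, the receiving side only needs to accept an HTTP POST. The following is a minimal Flask sketch of such a receiver; the /xxl-job-alerts path is arbitrary, and because the alert payload format depends on your alert policy and instance configuration, the handler simply logs the raw JSON body:

```python
from flask import Flask, request

app = Flask(__name__)

@app.route("/xxl-job-alerts", methods=["POST"])
def receive_alert():
    # Payload schema is not fixed here; log whatever the alert policy sends.
    payload = request.get_json(silent=True) or {}
    app.logger.warning("XXL-JOB alert received: %s", payload)
    return {"status": "ok"}, 200

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```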
Observability
Scheduling dashboard: View instance-level and application-level scheduling curves for scheduled, successful, and failed executions.
Execution history: Review the status, basic information, inputs, outputs, time consumption, and token consumption of each Dify workflow run.
Scheduling events: Inspect workflow-related and node-related events for each scheduled execution.
Node tracking: Trace the execution result of every node in a single workflow run. Drill-down is supported for loop nodes, iteration nodes, and conditional branch nodes.
Prerequisites
Before you begin, make sure that you have:
A deployed Dify service. For more information, see Install ack-dify.
An MSE XXL-JOB instance with an engine version later than 2.2.0.
An application created on the instance.
Step 1: (Optional) Configure an internal endpoint for the Dify API server
If your Dify service runs on Alibaba Cloud, create an internal Server Load Balancer (SLB) endpoint so that XXL-JOB can reach the Dify API server over the VPC instead of the public Internet.
Log on to the Container Service for Kubernetes (ACK) console.
In the left-side navigation pane of the Clusters page, click the name of the cluster where Dify is deployed.
In the left-side navigation pane, choose Network > Services, and then click Create.
In the Create Service dialog box, configure the following settings, configure the remaining parameters as needed, and then click OK.
Set Service Type to SLB.
Set Access Method to Internal Access.
In Backend, enter component and proxy.


Verify that the internal endpoint is generated.
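If you prefer automation over the console, an equivalent internal SLB Service can be created with the Kubernetes Python client, as in the sketch below. This is a hedged example: the dify namespace, the component: proxy selector, and port 80 are assumptions that must match your ack-dify deployment; the intranet annotation is what makes the SLB instance internal to the VPC.

```python
from kubernetes import client, config

config.load_kube_config()  # or load_incluster_config() when running inside the cluster

service = client.V1Service(
    metadata=client.V1ObjectMeta(
        name="dify-api-internal",
        annotations={
            # ACK annotation that provisions an internal (VPC-only) SLB instance
            "service.beta.kubernetes.io/alibaba-cloud-loadbalancer-address-type": "intranet",
        },
    ),
    spec=client.V1ServiceSpec(
        type="LoadBalancer",
        selector={"component": "proxy"},  # assumed label of the Dify proxy pods
        ports=[client.V1ServicePort(port=80, target_port=80)],
    ),
)

v1 = client.CoreV1Api()
created = v1.create_namespaced_service(namespace="dify", body=service)
print(f"Created Service {created.metadata.name}; the internal IP appears under "
      "status.loadBalancer.ingress once the SLB is provisioned.")
```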

Step 2: Create a scheduled Dify workflow task
This step walks you through preparing a Dify workflow and then creating a scheduled task in XXL-JOB to run it.
2a. Prepare a Dify workflow
Log on to the Dify console, click Studio in the upper-right corner, and select Create from Template to create a workflow.
2b. Create a task in XXL-JOB
Log on to the MSE console and select a region in the top navigation bar.
In the left-side navigation pane, choose Task Scheduling > XXL-JOB Version.
Click the ID of the target instance. In the left-side navigation pane, click Task Management, and then click Create Task.
In the Create Task panel, set Job Type to Dify Workflow. For details on other task parameters, see Task management.

Configure the Dify-specific parameters:

| Parameter | Description | Where to find it |
|---|---|---|
| Endpoint | The endpoint of the Dify API server. If your Dify service runs on Alibaba Cloud, use the internal endpoint from Step 1. | In the Dify console, go to the API Access page. The endpoint is displayed in the upper-right corner. |
| API Key | The API key for the target workflow. Each workflow has its own key. | On the API Access page, click API Key in the upper-right corner. |
| Input | The workflow input in JSON format. This value corresponds to the inputs field in the request body. Example: {"input_text": "what is your name"} | Refer to the workflow's input schema. |
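Before creating the task, you can reproduce the call that the scheduler makes on each trigger to confirm that the Endpoint, API Key, and Input values are valid. The following Python sketch posts the example input to the Dify workflow-run API; the endpoint and API key values are placeholders to replace with your own.

```python
import requests

DIFY_ENDPOINT = "http://<internal-slb-ip>/v1"   # or the endpoint shown on the API Access page
API_KEY = "app-xxxxxxxxxxxxxxxx"                # the target workflow's API key

resp = requests.post(
    f"{DIFY_ENDPOINT}/workflows/run",
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    json={
        "inputs": {"input_text": "what is your name"},  # same JSON as the Input parameter
        "response_mode": "blocking",                     # wait for the workflow to finish
        "user": "xxl-job-smoke-test",
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["data"]["outputs"])
```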
Step 3: Verify the result
After you create the task, run it once manually to confirm that the Dify API connection, input parameters, and scheduling configuration all work as expected before automatic scheduling begins.
On the Task Management page, find the task and click Run once in the Operation column.

Click More in the Operation column and select Scheduling Records to view execution records.

Click details in the Operation column of an execution record to inspect the workflow run. Three tabs are available:
Basic information -- displays execution status, timing, and metadata.

Input and Output -- shows the JSON input sent to the workflow and the output returned.

Track -- traces every node in the workflow execution. If iteration nodes, loop nodes, or branch nodes are involved, click a node to drill down into its sub-executions.
