SchedulerX allows you to create workflows to orchestrate multiple jobs. You can also pass data from an upstream job to a downstream job in a workflow. This reduces the complexity of your business logic. In this topic, three jobs are used as an example to describe how to pass data from an upstream job to a downstream job in a workflow.

Background information

You can pass data only between simple Java jobs. To pass data between distributed Java jobs, use the MapReduce model. For more information, see MapReduce model.

Procedure

  1. Implement the job processor classes JobProcessor A, JobProcessor B, and JobProcessor C for the three applications, respectively.
    • JobProcessor A

      @Component
      public class TestSimpleJobA extends JavaProcessor {
          @Override
          public ProcessResult process(JobContext context) throws Exception {
              System.out.println("TestSimpleJobA " + DateTime.now().toString("yyyy-MM-dd HH:mm:ss"));
              // Return 1 as the job result. The result is passed to downstream jobs.
              return new ProcessResult(true, String.valueOf(1));
          }
      }
    • JobProcessor B

      @Component
      public class TestSimpleJobB extends JavaProcessor {
          @Override
          public ProcessResult process(JobContext context) throws Exception {
              System.out.println("TestSimpleJobB " + DateTime.now().toString("yyyy-MM-dd HH:mm:ss"));
              // Return 2 as the job result. The result is passed to downstream jobs.
              return new ProcessResult(true, String.valueOf(2));
          }
      }
    • JobProcessor C

      @Component
      public class TestSimpleJobC extends JavaProcessor {
          @Override
          public ProcessResult process(JobContext context) throws Exception {
              // Obtain the results that the upstream jobs returned.
              List<JobInstanceData> upstreamDatas = context.getUpstreamData();
              int sum = 0;
              for (JobInstanceData jobInstanceData : upstreamDatas) {
                  System.out.println("jobName=" + jobInstanceData.getJobName()
                      + ", data=" + jobInstanceData.getData());
                  // Each upstream result is a numeric string. Sum the values.
                  sum += Integer.parseInt(jobInstanceData.getData());
              }
              System.out.println("TestSimpleJobC sum=" + sum);
              return new ProcessResult(true, String.valueOf(sum));
          }
      }
  2. Deploy the applications in Enterprise Distributed Application Service (EDAS).
  3. Create a group for each application and create jobA, jobB, and jobC. For more information, see Create an application and Create a job.
  4. Create a workflow and import the jobs to the workflow. For more information, see Create a workflow.
  5. On the Process Management page, click More in the Operation column of the workflow and select Run once.

Result

Return to the Workflow details page, right-click jobA and select Details to view the job instance details of jobA. Repeat this step to view the job instance details of jobB and jobC.

The Result or error field of jobA displays 1, which is the value returned by JobProcessor A.

Similarly, the execution result of jobB is 2, which is the value returned by JobProcessor B. The execution result of jobC is 3 (1 + 2), which is the value returned by JobProcessor C. This indicates that the execution results of jobA and jobB were passed to jobC.

You can view the following information in the console:

      jobName=jobB, data=2
      jobName=jobA, data=1
      TestSimpleJobC sum=3
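The aggregation that produces sum=3 can be sketched as plain Java, independent of the SchedulerX runtime. The class name UpstreamSum and the hard-coded result strings are hypothetical; in the real job, the strings come from JobInstanceData.getData():

```java
import java.util.List;

// Minimal sketch of the aggregation logic in TestSimpleJobC.
// UpstreamSum and the sample data below are hypothetical; in the workflow,
// the strings are the results returned by the upstream jobs.
public class UpstreamSum {
    // Sum the numeric result strings returned by the upstream jobs.
    public static int sum(List<String> upstreamResults) {
        int sum = 0;
        for (String data : upstreamResults) {
            sum += Integer.parseInt(data);
        }
        return sum;
    }

    public static void main(String[] args) {
        // jobA returned "1" and jobB returned "2", so jobC computes 3.
        System.out.println("sum=" + sum(List.of("1", "2")));
    }
}
```
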