All execution components allow you to react to job success and failure. On job success, a token is sent to the first output port; on job failure, the token is sent to the second output port.
All execution components also allow you to redirect the error token to the first output port. Use the Redirect error output attribute for a uniform reaction to job success and failure.
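The two-port routing described above can be sketched as a small Python analogy. The function names and the ports-as-return-values are illustrative only, not a real API:

```python
def route(job_succeeded, token):
    """Return the output port a token is sent to: 0 on success, 1 on failure."""
    return 0 if job_succeeded else 1

def route_with_redirect(job_succeeded, token, redirect_error_output=False):
    """With 'Redirect error output' enabled, every token goes to port 0."""
    if redirect_error_output:
        return 0
    return route(job_succeeded, token)
```

With the attribute disabled, success and failure tokens flow to different ports; with it enabled, a single downstream branch handles both cases uniformly.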
Sequential job execution is performed by simple chaining of execution components. Failure of any job causes jobflow termination.
Sequential job execution can be extended with common job-failure handling. The TokenGather component is suitable for gathering all tokens that represent job failures.
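As a rough Python analogy of this pattern (all names are hypothetical), jobs run in order, the first failure terminates the sequence, and the failure token is collected for common handling, which is the role TokenGather plays:

```python
def run_sequence(jobs):
    """Run (name, job) pairs in order; stop at the first failure."""
    failures = []
    for name, job in jobs:
        try:
            job()
        except Exception as exc:           # a job failure emits an error token
            failures.append((name, str(exc)))
            break                          # failure terminates the sequence
    return failures

def ok():
    pass

def failing():
    raise RuntimeError("boom")

# job3 is never reached once job2 fails
failures = run_sequence([("job1", ok), ("job2", failing), ("job3", ok)])
# failures == [("job2", "boom")]
```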
Parallel job execution is achieved with a set of independent executors. A reaction to success and failure is available for each individual job.
The Barrier component allows you to react to the success or failure of jobs running in parallel. By default, a group of parallel jobs is considered successful only if all of them finish successfully. Barrier offers various settings to adjust this behavior.
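The default "all must succeed" semantics can be illustrated in Python with `concurrent.futures`; the `require_all` flag is a hypothetical stand-in for Barrier's configurable group-success settings:

```python
from concurrent.futures import ThreadPoolExecutor

def barrier(jobs, require_all=True):
    """Run jobs in parallel; report group success.

    require_all=True  -> group succeeds only if every job succeeds (default)
    require_all=False -> group succeeds if at least one job succeeds
    """
    def attempt(job):
        try:
            job()
            return True
        except Exception:
            return False

    with ThreadPoolExecutor() as pool:
        results = list(pool.map(attempt, jobs))
    return all(results) if require_all else any(results)
```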
Conditional processing is enabled by token routing. Based on the results of a job, the Condition component lets you decide which branch of processing is used afterwards.
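Conceptually, the Condition component applies a predicate to an incoming token and routes it to one of two branches. A minimal Python sketch, with illustrative branch names:

```python
def condition(token, predicate):
    """Route a token to branch 0 when the predicate holds, branch 1 otherwise."""
    return ("branch0", token) if predicate(token) else ("branch1", token)

# Example: route on a job's exit code carried in the token
is_success = lambda t: t["exitCode"] == 0
```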
A parent jobflow can pass data to a child job using input dictionary entries. These job parameters can be read by the GetJobInput component and used in further processing. In the other direction, jobflow results can be written to output dictionary entries using the SetJobOutput component; these results are then available in the parent jobflow.
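The parameter-passing contract can be mimicked with plain dictionaries. The key names below (`fileName`, `status`) are examples, not fixed entry names:

```python
def child_jobflow(input_dictionary):
    """Read input entries (GetJobInput's role), compute, and write
    output entries (SetJobOutput's role) back to the parent."""
    name = input_dictionary.get("fileName", "unknown")    # read job parameter
    output_dictionary = {"status": "processed:" + name}   # publish result
    return output_dictionary

# The parent jobflow supplies the input dictionary and receives the output one
result = child_jobflow({"fileName": "orders.csv"})
```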
You can intentionally stop processing of a jobflow using the Fail component. The component can report a user-specified message.
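In most programming languages this corresponds to raising an error with a message; the exception class below is an illustrative stand-in, not part of any real API:

```python
class JobflowFailure(Exception):
    """Stand-in for the Fail component's intentional termination."""

def fail(message):
    """Stop processing and report a user-specified message."""
    raise JobflowFailure(message)
```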
Parallel processing of a variable number of jobs is achieved using asynchronous job execution. The example below shows how to process all CSV files in parallel. First, all file names are listed by the ListFiles component. A separate graph is executed asynchronously for each file name by the ExecuteGraph component. Graph run identifiers (runId) are sent to the MonitorGraph component, which waits for all graph results.
Asynchronous execution is available only for graphs and jobflows.
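The list-then-fan-out-then-wait pattern (ListFiles, asynchronous ExecuteGraph, MonitorGraph) maps naturally onto futures. A hedged Python sketch, where `process_file` stands in for one graph run:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def process_file(path):
    """Stand-in for a single graph execution working on one CSV file."""
    return path.upper()

def run_all(paths):
    """Submit one asynchronous run per file, then wait for all results."""
    with ThreadPoolExecutor() as pool:
        # submit() returns a future, analogous to a runId handed to a monitor
        futures = [pool.submit(process_file, p) for p in paths]
        # as_completed() plays the MonitorGraph role: wait for every run
        return sorted(f.result() for f in as_completed(futures))
```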
Jobflow provides a set of file operation components: list, create, copy, move, and delete files. This use case shows how to use file operation components to process a set of remote files. The files are downloaded from a remote FTP server, each file is processed by a job, results are copied to a final destination, and any temporary files are deleted.
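The same list/copy/move/clean-up sequence can be sketched with Python's standard library, using local directories as stand-ins for the remote FTP source (the FTP download itself is out of scope here):

```python
import os
import shutil

def process_files(source_dir, final_dir):
    """List files, stage each through a temporary copy, move the result
    to the final destination, and leave no temporary files behind."""
    os.makedirs(final_dir, exist_ok=True)
    processed = []
    for name in sorted(os.listdir(source_dir)):   # ListFiles analogy
        src = os.path.join(source_dir, name)
        tmp = src + ".tmp"
        shutil.copy(src, tmp)                     # CopyFiles (download stand-in)
        final = os.path.join(final_dir, name)
        shutil.move(tmp, final)                   # MoveFiles; also removes the temp
        processed.append(name)
    return processed
```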
Graphs and jobflows can be explicitly aborted by the KillGraph and KillJobflow components. The example below shows how to process a list of tasks in parallel while automatically aborting any job that exceeds a user-specified timeout.
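As a loose analogy of the timeout-then-abort pattern, a Python future can be given a result deadline and abandoned when it is missed (a best-effort abort, unlike a real KillGraph):

```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError as FutureTimeout

def run_with_timeout(job, timeout):
    """Return ('ok', result) if the job finishes in time,
    otherwise ('aborted', None) after giving up on it."""
    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(job)
        try:
            return ("ok", future.result(timeout=timeout))
        except FutureTimeout:
            future.cancel()              # best-effort abort of the run
            return ("aborted", None)
```

Note that a plain thread cannot be forcibly killed in Python, so `cancel()` only prevents a not-yet-started job from running; the real components abort the remote graph run itself.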