Grouping process model tasks

Data360 DQ+ Help

Product type: Software
Portfolio: Verify
Product family: Data360
Product: Data360 DQ+
Version: Latest
Language: English
Product name: Data360 DQ+
Title: Data360 DQ+ Help
Copyright: 2024
First publish date: 2016
Last edited: 2024-07-09
Last published: 2024-07-09T15:09:58.774265

This example describes how to use Execute Process Task nodes to reduce the number of direct dependencies within large process models.

An important consideration when creating process models is that each process model embodies a single process, and that process is designed to terminate if any one node within it fails. Therefore, when you are executing multiple data stages that are not strictly dependent on one another, it can be useful to create "child" process models and reference them with Execute Process Task nodes in a parent process model.

 

For example, if any one of the nodes (1-9) in the following process model fails, the entire process terminates. Even nodes that do not depend on the failed node will not run.
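As a rough analogy, the all-or-nothing behavior can be modeled as a single chain of tasks. The sketch below is plain Python, not Data360 DQ+ functionality; `run_process_model` and `make_task` are invented names used only for illustration:

```python
# Hypothetical sketch (not the Data360 DQ+ API): a single process model in
# which every node is a hard dependency, so one failure stops the whole run.

def run_process_model(nodes):
    """Run nodes in order; terminate at the first failure."""
    completed = []
    for name, task in nodes:
        try:
            task()
        except Exception as exc:
            print(f"{name} failed ({exc}); terminating process model.")
            break
        completed.append(name)
    return completed

def make_task(i, fail_at=5):
    """Stand-in for an Execute Stage Task; node `fail_at` raises an error."""
    def task():
        if i == fail_at:
            raise RuntimeError("stage error")
    return task

nodes = [(f"node{i}", make_task(i)) for i in range(1, 10)]
print(run_process_model(nodes))  # ['node1', 'node2', 'node3', 'node4']
```

Here node 5 fails, so nodes 6-9 never execute even though they may not depend on node 5 at all.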

Figure: Large Process Model

 

To prevent a single node's failure from terminating the entire process model when other nodes could still run, you can use Execute Process Task nodes.

 

For example, consider grouping the Execute Stage Tasks as follows.

 

Figure: Grouping processes

You could refactor these groupings into their own process models, then add each new process model to the original process model using Execute Process Task nodes.

 

This would result in the following process model (PM), which references process models A and B via Execute Process Task nodes:

Figure: Child Process Models

 

The advantage of bringing in separate processes via Execute Process Task nodes is that one child process (A or B in this example) can fail without terminating the entire parent process model (PM in this example).
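Continuing the same hypothetical Python analogy (again, `run_child` and `run_parent` are invented names, not product API), the parent invokes each child independently, so one child's failure does not abort the other:

```python
# Hypothetical sketch (not the Data360 DQ+ API): a parent process model that
# invokes child process models A and B. A failure inside one child terminates
# only that child; the other child still runs to completion.

def run_child(name, nodes):
    """Run one child process model; return True on success, False on failure."""
    for node_name, task in nodes:
        try:
            task()
        except Exception as exc:
            print(f"{name}: {node_name} failed ({exc}); child terminated.")
            return False
    return True

def run_parent(children):
    """Execute each child independently and collect per-child outcomes."""
    return {name: run_child(name, nodes) for name, nodes in children.items()}

def ok():
    pass

def fail():
    raise RuntimeError("stage error")

results = run_parent({
    "A": [("node1", ok), ("node2", fail), ("node3", ok)],  # A stops at node2
    "B": [("node4", ok), ("node5", ok)],                   # B completes
})
print(results)  # {'A': False, 'B': True}
```

In this sketch, child A terminates at its failing node, but child B still completes, mirroring how the failure of one child process model does not terminate the parent or its other children.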