Use the options on this tab to control how records are processed.
- Batch size
- This option specifies how many records or transactions are read in a batch before they are written to the model. You can adjust this setting to optimize performance. The default setting is 500. Increasing the batch size increases memory usage.
- If your system has the processing capacity to support increased memory usage, you can increase the batch size to improve performance.
- On slow networks, avoid setting a very large batch size.
- If different instances of Context Graph stages often write to models simultaneously, set this option to a small or medium value (100 to no more than a few thousand) to reduce overall memory usage.
- A batch is rolled back if a statement fails. The larger the batch size, the longer the rollback takes.
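The batching behavior described above can be sketched as follows. This is an illustrative sketch only, not the Context Graph stage's actual implementation; the `write_in_batches` function and `write_batch` callback are hypothetical names. It shows the core trade-off: a larger `batch_size` means more records buffered in memory at once, and more work lost if one batch fails and is rolled back.

```python
def write_in_batches(records, write_batch, batch_size=500):
    """Buffer records and pass them to write_batch in groups of at
    most batch_size. Each call to write_batch represents one
    transaction: if it fails, only the current batch is rolled back,
    so larger batches cost more memory and a longer rollback.
    batch_size defaults to 500, matching the option's default."""
    batch = []
    flushed = 0
    for record in records:
        batch.append(record)          # memory use grows with batch_size
        if len(batch) >= batch_size:
            write_batch(batch)        # one transaction per full batch
            flushed += len(batch)
            batch = []
    if batch:                         # write the final partial batch
        write_batch(batch)
        flushed += len(batch)
    return flushed
```

For example, writing 1,200 records with the default batch size of 500 produces three transactions of 500, 500, and 200 records.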