Apply frequency

The size of the transient storage area is also affected by how frequently changed data is applied to a target. Precisely recommends switching to a streaming Apply model for your target, or raising the Apply frequency as high as practical, when capturing rapidly changing tables in high-volume Db2 environments, especially if space is an issue.

For example, while changes from a Db2 source to an Oracle, Db2/LUW, or other target may only need to be applied once a day, the transient storage must be sized large enough to hold all of the changed data accumulated during that one-day period. Often, however, the estimated size proves inadequate. When that happens, the Capture will eventually stop mining the Db2 Log and wait for an Engine to connect and Publishing to resume. By the time the Capture finally requests the next log record from Db2, the required Db2 Archive Logs may have become inaccessible. This occurs when the wait period is long enough, or the volume of changing data large enough, that the Archive Log retention period proves too short.
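
The transient storage requirement is essentially the volume of changed data generated between Apply cycles. The following sketch illustrates that arithmetic with hypothetical numbers; the change rate, average captured record size, and Apply intervals are assumptions chosen for illustration, not measured or recommended values.

    # Illustrative sizing arithmetic only; the change rate, record size and
    # Apply intervals below are assumptions, not values from any real workload.

    def transient_storage_bytes(rows_per_hour, avg_record_bytes, apply_interval_hours):
        # Worst-case changed data accumulated between two Apply cycles.
        return rows_per_hour * avg_record_bytes * apply_interval_hours

    GIB = 1024 ** 3
    rows_per_hour = 2_000_000   # assumed peak change rate for the captured tables
    record_bytes = 500          # assumed average captured record size

    # A once-a-day Apply must hold a full day of changes in transient storage;
    # an hourly Apply keeps the same workload's footprint much smaller.
    daily = transient_storage_bytes(rows_per_hour, record_bytes, 24)
    hourly = transient_storage_bytes(rows_per_hour, record_bytes, 1)

    print(f"Apply every 24h: ~{daily / GIB:.1f} GiB of transient storage")
    print(f"Apply every  1h: ~{hourly / GIB:.1f} GiB of transient storage")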

Best practices for Db2 Archive Log retention will normally ensure that the Archive Logs remain accessible; in some environments, however, this can become an issue. Precisely recommends analyzing the total Db2 workload in all cases, because even though only a fraction of the existing tables may be configured for capture, the Db2/z Log Reader capture potentially requires access to every log archived since the last Apply cycle.
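
A simple way to sanity-check a retention policy against the Apply schedule is to confirm that the retention window exceeds the longest expected gap between Apply cycles, including any outage you intend to tolerate. The check below is a minimal sketch with assumed values; the retention period, Apply interval, and outage allowance are illustrative only.

    # Illustrative retention check; every value below is an assumption for the example.

    archive_log_retention_hours = 48   # assumed site Archive Log retention policy
    apply_interval_hours = 24          # scheduled gap between Apply cycles
    outage_allowance_hours = 36        # assumed longest Engine outage to tolerate

    # The Db2/z Log Reader may need every log archived since the last Apply cycle,
    # so retention must cover the Apply interval plus any outage on top of it.
    required_hours = apply_interval_hours + outage_allowance_hours

    if archive_log_retention_hours >= required_hours:
        print("Archive Log retention covers the worst-case gap between Apply cycles.")
    else:
        shortfall = required_hours - archive_log_retention_hours
        print(f"Retention is {shortfall}h short; archived logs could expire "
              "before the Capture requests them.")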