Copy and Move

Connect CDC (SQData) Utilities

Product type: Software
Portfolio: Integrate
Product family: Connect
Product: Connect > Connect CDC (SQData)
Version: Latest
Language: English
Product name: Connect CDC (SQData)
Title: Connect CDC (SQData) Utilities
Copyright: 2024
First publish date: 2000
Last updated: 2024-07-30
Published on: 2024-07-30T19:47:43.164598

These actions provide direct access to data generated by Connect CDC SQData Captures and Publishers that is normally consumed by Engines. SQDutil can copy or move data from a source to a target datastore. The arguments are typically URLs. Copy is non-destructive, while Move is destructive: it consumes records from the source. The distinction between move and copy exists only if the source type allows record-level consumption. Source datastores supporting this distinction include data published by CDCStore and CDCzLog, as well as data captured and written to z/OS LogStreams and files.

Syntax

sqdutil copy | move <source_url> [<source_url>...] <target_url> | <file_name> | DD:<dd_name> [options]
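
For instance, the same transfer can be written positionally or with the explicit --from/--to options described below; the host, port, engine and file names here are placeholders, not values from this document:

sqdutil copy cdc://MYHOST:2626/MYCDC/MYENGINE /tmp/cdc_out.dat

sqdutil copy --from=cdc://MYHOST:2626/MYCDC/MYENGINE --to=/tmp/cdc_out.dat

In the first form, the first URL is the source and the last argument is the target; the second form names both explicitly.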

Keyword and Parameter Descriptions
Keyword Description
source_url

One or more source URLs; all URL arguments except the last are treated as sources. See --from below.

target_url | file_name | DD:dd_name

The target datastore: the last positional argument on the command line. See --to below.

[options]
  • --add-stop - Add a "stop message" to the output upon completion of the task. This message causes an Apply Engine reading the data to stop normally.

  • --append - Open the output in append mode when the output supports it (e.g., a file).
  • --delay=<seconds> - Introduce a delay of the specified number of seconds after each unit-of-work (UOW) commit is processed. This can be used to artificially slow the processing or forwarding of records.
  • --from=<url> - Explicitly name a source if more than one source is being processed. Otherwise, the source is specified as the first parameter in the copy/move command.

  • --identity=<path> - Override the default NaCl identity (private key) used when connecting to a cdc:// source, where path is the location of the private key. On non-z/OS platforms, path is a local file system path and file name, or an AKV URL, for the NaCl private key to be used. On z/OS, both the public and private key files are instead specified at runtime by DD statements.
  • --max=<number> - Specifies the number of records to be copied/moved. If used with --uow, this specifies the number of transactions that will be copied/moved.
  • --pass-stop - Forward a stop message from the source to the target instead of acting on it and stopping the copy. Typically used under the guidance of Precisely support.
  • --progress - On POSIX systems, display the number of records read and written on a regular interval.
  • --rotate-on-delay=<seconds> - Hadoop HDFS targets only. Rotate the output after the specified number of seconds. Note that rotation only occurs on Unit-of-Work boundaries.
  • --rotate-on-size=<size>[K|M|G] - Hadoop HDFS targets only. Rotate the output after the specified number of bytes are copied. Note that rotation only occurs on Unit-of-Work boundaries.
  • --sample=<number> - Sample records, or units-of-work if --uow is also specified, at the indicated rate: intervening records or UOWs are skipped, and sampling continues until the specified number is reached.
  • --skip=<number> - Skip the first <number> records, or units-of-work if --uow is also specified. With the move command, when supported by the source, the skipped records are effectively lost.

  • --to=<url> - Explicitly name a target URL. Otherwise, the target is the last positional argument on the command line.

  • --uow - Operate on units-of-work (transactions) rather than individual records. Changes the interpretation of --max, --sample and --skip.

  • --wait=<seconds> - Wait for input for the specified number of seconds and terminate the utility if none arrives. If --wait is not specified, the default behavior is to wait indefinitely.
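
As a sketch of how these options combine (host, port, engine and file names are placeholders): skip the first ten units-of-work, copy the next three, and terminate if no input arrives for 30 seconds:

sqdutil copy cdc://MYHOST:2626/MYCDC/MYENGINE /tmp/sample.dat --uow --skip=10 --max=3 --wait=30

Because --uow is specified, --skip and --max count transactions rather than individual records.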

Note:
  • copy/move requires at least one source and one target. If necessary, the source can be specified using --from and the target using --to. Positional arguments on the command line are treated as URLs: if no target is specified using --to, the last such argument is deemed the target, and every other argument is deemed a source.
  • On z/OS "DD:<dd_name>" can be specified in place of a target_url. A DD statement with that dd_name would then be used to specify a sequential dataset name for the target data.
  • On Linux, AIX and Windows, file_name can include a path and can also be specified using the file:// URL type.
  • Accessing data from a Publisher requires the same authentication as a connecting Engine. Public and private keys must be provided; on non-z/OS platforms they may instead be located in the default locations or specified using --identity.

Example 1

Copy or move source CDC data to a file. This is one of the most common use cases for both "copy" and "move" and provides a way to collect CDC data for any kind of testing, including diagnostic or regression testing where modifications to an Engine or Replicator script need to be confirmed against prior results. The syntax is nearly identical on all platforms and can be run at the command prompt or in a script:

sqdutil copy cdc://ZOS10:2626/DB2CDC/ENGINE1 /home/sqdata/db2out.dat --uow --max=3 --wait=10

Note: The file target can also be specified using URL syntax; in this example, file:///db2out.dat indicates db2out.dat in the current working directory.

Example 2

The "move" option is frequently used to "consume" published CDC data when flow to the actual target datastore is not desired or required, for example when a capture that supports testing publishes to multiple Engines, each writing to a different test environment target such as Development, Integration and Performance. Substituting sqdutil for an Engine and writing the data to the "bit bucket" is easier than modifying the Capture/Publisher configuration or the Engines. The syntax is nearly identical on all platforms and can be run at the command prompt or in a script:

sqdutil move cdc://ZOS10:2626/DB2CDC/ENGINE1 /dev/null

Note: To keep the utility running until either the Capture disconnects or the utility is killed, omit the parameters used to limit the amount of data "moved".
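
A throttled variant of the same idea, using the source URL from the example above with the documented --uow and --delay options, drains one unit-of-work per second so the Capture sees slow but steady consumption:

sqdutil move cdc://ZOS10:2626/DB2CDC/ENGINE1 /dev/null --uow --delay=1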

Example 3

The use case described in the first example works the same way on z/OS, with JCL similar to sample member SQDUTIL included in the distribution. Copy DB2 CDC data directly from the Publisher to a file, as in the sample below:
//SQDUTIL  EXEC PGM=SQDUTIL                                  
//SQDPUBL  DD DSN=&SYSUID.NACL.PUBLIC,DISP=SHR                      
//SQDPKEY  DD DSN=&SYSUID.NACL.PRIVATE,DISP=SHR                      
//SYSPRINT DD SYSOUT=*                                              
//SYSOUT   DD SYSOUT=*                                              
//DB2OUT   DD DISP=SHR,DSN=SQDATA.DB2.OUTPUT
//SQDPARMS DD *                                                      
copy cdc://ZOS10:2626/DB2CDC/ENGINE1 DD:DB2OUT --uow --max=3 --wait=10
/*
//

Example 4

While most frequently used to access CDC formatted records from a Publisher, the SQDutil utility can also read and write z/OS LogStreams and sequential files on z/OS with the same JCL but specifying the zlog URL type, as in the sample below:
//SQDUTIL  EXEC PGM=SQDUTIL                                          
//SQDPUBL  DD DSN=&SYSUID.NACL.PUBLIC,DISP=SHR                      
//SQDPKEY  DD DSN=&SYSUID.NACL.PRIVATE,DISP=SHR                      
//SYSPRINT DD SYSOUT=*                                              
//SYSOUT   DD SYSOUT=*                                              
//COPYOUT  DD DISP=(,CATLG),DSN=SQDATA.IMSCDC.LOGn.BACKUP,
// DCB=(RECFM=FB,LRECL=80,BLKSIZE=27920),
// SPACE=(CYL,(50,5),RLSE),UNIT=SYSDA
//SQDPARMS DD *                                                    
copy zlog:///SQDATA.IMSCDC.LOGn DD:COPYOUT
/*
//
Note:
  • The LogStream is read directly, so no NaCl authentication is required.
  • The source z/OS LogStream must be on the same system; therefore the URL contains three consecutive "/" characters, indicating localhost.
  • The direction of data flow can be reversed by substituting the following two lines:
     //COPYIN   DD DISP=SHR,DSN=SQDATA.IMSCDC.LOGn.BACKUP
     copy DD:COPYIN zlog:///SQDATA.IMSCDC.LOGn