Setting up Precisely Connect CDC Apply Engine

AWS Mainframe Modernization Data Replication for IBM z/OS

  • Example Apply Engine script, placed in the working directory of the Apply Engine.
    • ./dbd and ./cpy are subdirectories of the working directory where the IMS DBD source and the copybooks for the IMS segments are stored.
    • IP:PORT is the IP address of the mainframe and the port number on which the daemon process is running on the mainframe system.
    • sqdata-mks-topic is the name of the Kafka topic created in an earlier step, to which messages will be produced (see the verification sketch after the script).
    JOBNAME IMSTOORA;
    REPORT EVERY 5;
    -- Change Op is 'I' for insert, 'D' for delete, and 'R' for replace; for RDS targets it is 'U' for update
    OPTIONS
      CDCOP('I', 'U', 'D'),
      PSEUDO NULL = NO,
      USE AVRO COMPATIBLE NAMES;
    --       SOURCE DESCRIPTIONS
    DESCRIPTION IMSDBD ./dbd/DBCUSTSQ AS DBCUSTSQ;
    BEGIN GROUP IMS_SRC;
      DESCRIPTION COBOL ./cpy/custmast AS custmast_file
              FOR SEGMENT CUSTMAST
              IN DATABASE DBCUSTSQ;
      DESCRIPTION COBOL ./cpy/custaddr AS custaddr_file
              FOR SEGMENT CUSTADDR
              IN DATABASE DBCUSTSQ;
    END GROUP;
    --       SOURCE DATASTORE (IP & Publisher name)
      DATASTORE cdc://IP:PORT/IMSPUBY/IMSTOORA
        OF IMSCDC
        AS CDCIN
        DESCRIBED BY GROUP IMS_SRC ;
    --       TARGET DATASTORE(s)
      DATASTORE 'kafka:///sqdata-mks-topic/root_key'
        OF JSON
        AS CDCOUT
        DESCRIBED BY GROUP IMS_SRC;
    --       MAIN SECTION
    PROCESS INTO CDCOUT
    SELECT
    {
      REPLICATE(CDCOUT)
    }
    FROM CDCIN;
    
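To verify that the Apply Engine is publishing change records, you can read a few messages back from the topic. The following is a minimal sketch using the kafka-python package; the broker endpoint, user name, and password are placeholders for your own MSK SASL/SCRAM settings and are not values defined in this guide.

# verify_cdc_topic.py - minimal verification sketch (kafka-python).
# Broker endpoint and SCRAM credentials below are placeholders.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "sqdata-mks-topic",                          # topic written to by the CDCOUT datastore
    bootstrap_servers=["b-1.example.kafka.us-east-1.amazonaws.com:9096"],
    security_protocol="SASL_SSL",
    sasl_mechanism="SCRAM-SHA-512",
    sasl_plain_username="REPLACE_WITH_USERNAME",
    sasl_plain_password="REPLACE_WITH_PASSWORD",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

# Each message is one JSON change record emitted by REPLICATE(CDCOUT).
for count, record in enumerate(consumer, start=1):
    print(record.key, record.value)
    if count >= 10:
        break
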
Example Kafka producer configuration file sqdata_kafka_producer.conf in the working directory of the Apply Engine. Supply your own SASL/SCRAM credentials and Kafka broker list for the empty values.
builtin.features=SASL_SCRAM
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512
sasl.username=
sasl.password=
metadata.broker.list=
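
For reference, a populated file might look like the following. The user name, password, and broker endpoints shown are hypothetical examples for an Amazon MSK cluster that uses SASL/SCRAM authentication; substitute your own values.

# Hypothetical example values - replace with your own SCRAM credentials and MSK broker endpoints
builtin.features=SASL_SCRAM
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512
sasl.username=example-sqdata-user
sasl.password=example-password-from-secrets-manager
metadata.broker.list=b-1.examplecluster.kafka.us-east-1.amazonaws.com:9096,b-2.examplecluster.kafka.us-east-1.amazonaws.com:9096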