Create an IAM role

AWS Mainframe Modernization Data Replication for IBM z/OS

Product
AWS Mainframe Modernization
Version
Latest
Copyright
2025
Last updated
2025-02-10
In this step, you perform two tasks.
  • First, create an IAM policy that grants access to create topics on the cluster and to send data to those topics.
  • Second, create an IAM role and associate the policy with it. In a later step, the Apply Engine EC2 instance assumes this role and uses it to create a topic on the cluster and to send data to that topic.

Create an IAM policy to write to topics

  1. Open the IAM console at https://console.aws.amazon.com/iam/.
  2. In the navigation pane, select Policies.
  3. Select Create policy.
  4. Select the JSON tab and replace the contents of the editor window with the following policy.
    1. Replace region with the code of the AWS Region where you created your cluster.
    2. Replace Account-ID with your AWS account ID.
    3. Replace MSKTutorialCluster in each Resource ARN with the name of your MSK cluster, if you used a different name.
      {
          "Version": "2012-10-17",
          "Statement": [
              {
                  "Action": [
                      "aws-marketplace:MeterUsage"
                  ],
                  "Effect": "Allow",
                  "Resource": "*"
              },
              {
                  "Effect": "Allow",
                  "Action": [
                      "kafka-cluster:Connect",
                      "kafka-cluster:AlterCluster",
                      "kafka-cluster:DescribeCluster"
                  ],
                  "Resource": [
                      "arn:aws:kafka:region:Account-ID:cluster/MSKTutorialCluster/*"
                  ]
              },
              {
                  "Effect": "Allow",
                  "Action": [
                      "kafka-cluster:*Topic*",
                      "kafka-cluster:WriteData",
                      "kafka-cluster:ReadData"
                  ],
                  "Resource": [
                      "arn:aws:kafka:region:Account-ID:topic/MSKTutorialCluster/*"
                  ]
              },
              {
                  "Effect": "Allow",
                  "Action": [
                      "kafka-cluster:AlterGroup",
                      "kafka-cluster:DescribeGroup"
                  ],
                  "Resource": [
                      "arn:aws:kafka:region:Account-ID:group/MSKTutorialCluster/*"
                  ]
              }
          ]
      }
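If you prefer, the placeholder substitution can be done in a shell before you paste the policy into the console. The following is a sketch only: the region (us-east-1) and account ID (111122223333) are example values, not yours.

```shell
# Example values only -- substitute your own Region and account ID.
REGION="us-east-1"
ACCOUNT_ID="111122223333"

# Generate the policy with the placeholders filled in (the heredoc is
# unquoted so ${REGION} and ${ACCOUNT_ID} expand).
cat > msk-policy.json <<EOF
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": ["aws-marketplace:MeterUsage"],
            "Effect": "Allow",
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "kafka-cluster:Connect",
                "kafka-cluster:AlterCluster",
                "kafka-cluster:DescribeCluster"
            ],
            "Resource": ["arn:aws:kafka:${REGION}:${ACCOUNT_ID}:cluster/MSKTutorialCluster/*"]
        },
        {
            "Effect": "Allow",
            "Action": [
                "kafka-cluster:*Topic*",
                "kafka-cluster:WriteData",
                "kafka-cluster:ReadData"
            ],
            "Resource": ["arn:aws:kafka:${REGION}:${ACCOUNT_ID}:topic/MSKTutorialCluster/*"]
        },
        {
            "Effect": "Allow",
            "Action": [
                "kafka-cluster:AlterGroup",
                "kafka-cluster:DescribeGroup"
            ],
            "Resource": ["arn:aws:kafka:${REGION}:${ACCOUNT_ID}:group/MSKTutorialCluster/*"]
        }
    ]
}
EOF

# Sanity-check that the result is valid JSON before using it anywhere.
python3 -m json.tool msk-policy.json > /dev/null && echo "policy JSON OK"
```

You can then paste the contents of msk-policy.json into the console editor without further edits.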
      

Note: For more information about writing secure policies, refer to IAM access control.
  1. Select Next: Tags.
  2. Select Next: Review.
  3. Enter a descriptive policy name. For example, msk-tutorial-policy.
  4. Select Create policy.

Create an IAM role and attach a policy

  1. In the navigation pane, select Roles.
  2. Select Create role.
  3. Under Common use cases, select EC2, then select Next: Permissions.
  4. Enter the name of the policy that you previously created for this tutorial in the search box. Then, select the box to the left of the policy.
  5. Select Next: Tags.
  6. Select Next: Review.
  7. Enter a descriptive role name. For example, msk-role.
  8. Select Create role.

Attach the IAM role to the Apply Engine EC2 instance

Attach the IAM role msk-role that you created above to the Apply Engine EC2 instance that you created in earlier steps.
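The console steps in this section can also be scripted with the AWS CLI. The following is a minimal sketch, not a definitive procedure: it assumes the policy document is saved as msk-policy.json, and the instance ID and account ID shown are placeholders. The aws commands are commented out because they require valid AWS credentials.

```shell
# Trust policy that lets EC2 instances assume the role. This is the
# standard EC2 trust relationship, equivalent to choosing "EC2" as the
# use case in the console.
cat > ec2-trust-policy.json <<'EOF'
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": { "Service": "ec2.amazonaws.com" },
            "Action": "sts:AssumeRole"
        }
    ]
}
EOF
python3 -m json.tool ec2-trust-policy.json > /dev/null && echo "trust policy OK"

# CLI equivalents of the console steps (run where AWS credentials are
# configured; 111122223333 and i-0123456789abcdef0 are placeholders):
# aws iam create-policy --policy-name msk-tutorial-policy \
#     --policy-document file://msk-policy.json
# aws iam create-role --role-name msk-role \
#     --assume-role-policy-document file://ec2-trust-policy.json
# aws iam attach-role-policy --role-name msk-role \
#     --policy-arn arn:aws:iam::111122223333:policy/msk-tutorial-policy
# aws iam create-instance-profile --instance-profile-name msk-role
# aws iam add-role-to-instance-profile --instance-profile-name msk-role \
#     --role-name msk-role
# aws ec2 associate-iam-instance-profile --instance-id i-0123456789abcdef0 \
#     --iam-instance-profile Name=msk-role
```

Note that attaching a role to an EC2 instance goes through an instance profile, which the console creates for you implicitly; with the CLI you create and populate it yourself.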



Create a topic

In this step of Getting Started Using Amazon MSK, you install Apache Kafka client libraries and tools on the client machine, and then you create a topic.

To find the version of your MSK cluster:
  1. Sign in using this URL: https://eu-west-2.console.aws.amazon.com/msk/.
  2. Select the MSK cluster.
  3. Note the version of Apache Kafka used on the cluster.
  4. Replace instances of Amazon MSK version numbers in this tutorial with the version obtained in Step 3.


To create a topic on the client machine:
  1. Open the Amazon EC2 console at https://console.aws.amazon.com/ec2/.
  2. In the navigation pane, choose Instances. Then select the check box beside the name of the Apply Engine EC2 machine.
  3. Select Actions, and then choose Connect. Follow the instructions in the console to connect to your client machine.
  4. Install the librdkafka client library. The package name is librdkafka-devel; attempting to install a package named Librdkafka.so fails with "No package Librdkafka.so available".
    [ec2-user@ip-172-31-23-34 ~]$ sudo yum -y install librdkafka-devel
  5. Install Java on the client machine by running the following command:
    sudo yum -y install java-11
  6. Run the following command to download Apache Kafka.
    wget https://archive.apache.org/dist/kafka/{YOUR MSK VERSION}/kafka_2.13-{YOUR MSK VERSION}.tgz
    Note: If you want to use a mirror site other than the one used in this command, you can choose a different one on the Apache website.
  7. Run the following command in the directory where you downloaded the TAR file in the previous step.
    tar -xzf kafka_2.13-{YOUR MSK VERSION}.tgz
  8. Go to the kafka_2.13-{YOUR MSK VERSION}/libs directory, then run the following command to download the Amazon MSK IAM JAR file. The Amazon MSK IAM JAR makes it possible for the client machine to access the cluster.
    wget https://github.com/aws/aws-msk-iam-auth/releases/download/v1.1.1/aws-msk-iam-auth-1.1.1-all.jar
  9. Go to the kafka_2.13-{YOUR MSK VERSION}/bin directory. Copy the following property settings and paste them into a new file. Name the file client.properties and save it.
    security.protocol=SASL_SSL
    sasl.mechanism=AWS_MSK_IAM
    sasl.jaas.config=software.amazon.msk.auth.iam.IAMLoginModule required;
    sasl.client.callback.handler.class=software.amazon.msk.auth.iam.IAMClientCallbackHandler
  10. Open the Amazon MSK console at https://console.aws.amazon.com/msk/.
  11. Wait for the status of your cluster to become Active. This might take several minutes. After the status becomes Active, choose the cluster name. This takes you to a page containing the cluster summary.
  12. Select View client information.


  13. Copy the connection string for the private endpoint.

    You get three endpoints, one for each broker. You need only one broker endpoint for the following step.

  14. Run the following command, replacing BootstrapServerString with one of the broker endpoints that you obtained in the previous step.
    <path-to-your-kafka-installation>/bin/kafka-topics.sh --create --bootstrap-server BootstrapServerString --command-config client.properties --replication-factor 3 --partitions 1 --topic MSKTutorialTopic

    If the command succeeds, you see the following message: Created topic MSKTutorialTopic.
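If you prefer not to paste into an editor, the client.properties file from step 9 can also be written directly from the shell. Run this in the kafka_2.13-{YOUR MSK VERSION}/bin directory:

```shell
# Write the IAM-auth client properties exactly as shown in step 9.
# The quoted 'EOF' delimiter prevents any shell expansion of the contents.
cat > client.properties <<'EOF'
security.protocol=SASL_SSL
sasl.mechanism=AWS_MSK_IAM
sasl.jaas.config=software.amazon.msk.auth.iam.IAMLoginModule required;
sasl.client.callback.handler.class=software.amazon.msk.auth.iam.IAMClientCallbackHandler
EOF

# Quick sanity check: every setting line should contain a key=value pair.
grep -c '=' client.properties
```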

Produce and consume data

Create a producer and a consumer to verify that the topic works end to end.
  1. Run the following command to start a console producer. Replace BootstrapServerString with the broker endpoint that you obtained in Create a topic. For instructions on how to retrieve this connection string, see Getting the bootstrap brokers for an Amazon MSK cluster.
    <path-to-your-kafka-installation>/bin/kafka-console-producer.sh --broker-list BootstrapServerString --producer.config client.properties --topic MSKTutorialTopic
  2. Enter any message that you want, and press Enter. Repeat this step two or three times. Every time you enter a line and press Enter, that line is sent to your Apache Kafka cluster as a separate message.
  3. Keep the connection to the client machine open, and then open a second, separate connection to that machine in a new window.
  4. In the following command, replace BootstrapServerString with the broker endpoint that you saved earlier. Then, to create a console consumer, run the following command with your second connection to the client machine.
    <path-to-your-kafka-installation>/bin/kafka-console-consumer.sh --bootstrap-server BootstrapServerString --consumer.config client.properties --topic MSKTutorialTopic --from-beginning

    You start seeing the messages you entered earlier when you used the console producer command.

  5. Enter more messages in the producer window, and watch them appear in the consumer window.
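The interactive producer session above can also be driven from a file, which makes the smoke test repeatable. The following is a sketch only: KAFKA_HOME and BOOTSTRAP are placeholders for your own values, and the producer command is commented out because it needs a live cluster.

```shell
# Placeholders -- substitute your extracted Kafka directory and one of
# your broker endpoints from "Create a topic".
KAFKA_HOME=./kafka_2.13-3.5.1
BOOTSTRAP="b-1.mskcluster.example.amazonaws.com:9098"

# Put the test messages in a file so the same run can be repeated.
printf '%s\n' "first message" "second message" "third message" > messages.txt

# Pipe the file into the console producer (run on the client machine):
# "${KAFKA_HOME}/bin/kafka-console-producer.sh" \
#     --broker-list "${BOOTSTRAP}" \
#     --producer.config "${KAFKA_HOME}/bin/client.properties" \
#     --topic MSKTutorialTopic < messages.txt

wc -l < messages.txt
```

The consumer command from step 4, run with --from-beginning, should then show the same three lines.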