In this step, you perform two tasks.
- The first task is to create an IAM policy that grants access to create topics on the cluster and to send data to those topics.
- The second task is to create an IAM role and attach this policy to it. In a later step, you attach this role to the Apply Engine EC2 instance, which assumes the role to create a topic on the cluster and send data to that topic.
Create an IAM policy to write to topics
- Open the IAM console at https://console.aws.amazon.com/iam/.
- On the navigation pane, choose Policies.
- Select Create Policy.
- Click the JSON tab and replace the JSON in the editor window with the following JSON.
- Replace region with the code of the AWS Region where you created your cluster.
- Replace Account-ID with your account ID.
- In the Resource ARNs, replace MSKTutorialCluster so that the ARNs match your MSK cluster ARN.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "aws-marketplace:MeterUsage"
      ],
      "Effect": "Allow",
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "kafka-cluster:Connect",
        "kafka-cluster:AlterCluster",
        "kafka-cluster:DescribeCluster"
      ],
      "Resource": [
        "arn:aws:kafka:region:Account-ID:cluster/MSKTutorialCluster/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "kafka-cluster:*Topic*",
        "kafka-cluster:WriteData",
        "kafka-cluster:ReadData"
      ],
      "Resource": [
        "arn:aws:kafka:region:Account-ID:topic/MSKTutorialCluster/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "kafka-cluster:AlterGroup",
        "kafka-cluster:DescribeGroup"
      ],
      "Resource": [
        "arn:aws:kafka:region:Account-ID:group/MSKTutorialCluster/*"
      ]
    }
  ]
}
Note: For more information about writing secure policies, refer to IAM access control.
- Select Next: Tags.
- Select Next: Review.
- Enter a descriptive policy name. For example, msk-tutorial-policy.
- Select Create policy.
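If you prefer to script this step rather than use the console, the same policy can be created with the AWS CLI. The sketch below assumes the JSON above has been saved locally as msk-tutorial-policy.json (a file name chosen here for illustration).
# Create the IAM policy from the JSON document shown above.
# msk-tutorial-policy.json is an assumed local file containing that JSON.
aws iam create-policy \
    --policy-name msk-tutorial-policy \
    --policy-document file://msk-tutorial-policy.json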
Create an IAM role and attach a policy
- On the navigation pane, select Roles.
- Select Create role.
- Under Common use cases, select EC2, then select Next: Permissions.
- Enter the name of the policy that you previously created for this tutorial in the search box. Then, select the box to the left of the policy.
- Select Next: Tags.
- Select Next: Review.
- Enter a descriptive role name. For example, msk-role.
- Select Create role.
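The role can likewise be created and the policy attached with the AWS CLI. The sketch below assumes a trust policy file named ec2-trust-policy.json (name chosen for illustration) and that Account-ID is replaced with your account ID.
# ec2-trust-policy.json is an assumed local file that allows EC2 to assume the role:
# {
#   "Version": "2012-10-17",
#   "Statement": [
#     { "Effect": "Allow", "Principal": { "Service": "ec2.amazonaws.com" }, "Action": "sts:AssumeRole" }
#   ]
# }
aws iam create-role \
    --role-name msk-role \
    --assume-role-policy-document file://ec2-trust-policy.json

# Attach the policy created in the previous section to the new role.
aws iam attach-role-policy \
    --role-name msk-role \
    --policy-arn arn:aws:iam::Account-ID:policy/msk-tutorial-policy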
Attach IAM role to the Apply Engine EC2
Attach the msk-role IAM role created above to the Apply Engine EC2 instance that you created in the earlier steps.
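In the EC2 console, this is done by selecting the instance and choosing Actions, Security, Modify IAM role. If you prefer the AWS CLI, the sketch below shows the equivalent steps; the instance ID i-0123456789abcdef0 and the profile name msk-role-profile are placeholders. If you created the role in the console with the EC2 use case, an instance profile with the same name as the role usually already exists, and the first two commands can be skipped.
# Create an instance profile and add the role to it (profile name is a placeholder).
aws iam create-instance-profile --instance-profile-name msk-role-profile
aws iam add-role-to-instance-profile \
    --instance-profile-name msk-role-profile \
    --role-name msk-role

# Associate the instance profile with the Apply Engine EC2 instance (instance ID is a placeholder).
aws ec2 associate-iam-instance-profile \
    --instance-id i-0123456789abcdef0 \
    --iam-instance-profile Name=msk-role-profile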
Create a topic
In this step of Getting Started Using Amazon MSK, you install Apache Kafka client libraries and tools on the client machine, and then you create a topic.
To find the version of your MSK cluster:
- Sign in using this URL: https://eu-west-2.console.aws.amazon.com/msk/.
- Select the MSK cluster.
- Note the version of Apache Kafka used on the cluster.
- Replace instances of Amazon MSK version numbers in this tutorial with the version you noted in the previous step.
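The Apache Kafka version can also be read with the AWS CLI instead of the console. A minimal sketch, assuming the ARN placeholder is replaced with your MSK cluster ARN:
# Print the Apache Kafka version running on the cluster (the ARN below is a placeholder).
aws kafka describe-cluster \
    --cluster-arn arn:aws:kafka:region:Account-ID:cluster/MSKTutorialCluster/cluster-uuid \
    --query 'ClusterInfo.CurrentBrokerSoftwareInfo.KafkaVersion' \
    --output text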
To create a topic on the client machine:
- Open the Amazon EC2 console at https://console.aws.amazon.com/ec2/.
- In the navigation pane, choose Instances. Then select the check box beside the name of the Apply Engine EC2 instance.
- Select Actions, and then choose Connect. Follow the instructions in the console to connect to your client machine.
- Install the librdkafka client library. Note that the package is named librdkafka-devel; trying to install Librdkafka.so directly fails with "No package Librdkafka.so available".
sudo yum install librdkafka-devel
- Install Java on the client machine by running the following command:
sudo yum -y install java-11
- Run the following command to download Apache Kafka.
wget https://archive.apache.org/dist/kafka/{YOUR MSK VERSION}/kafka_2.13-{YOUR MSK VERSION}.tgz
Note: If you want to use a mirror site other than the one used in this command, you can choose a different one on the Apache website.
- Run the following command in the directory where you downloaded the TAR file in the previous step.
tar -xzf kafka_2.13-{YOUR MSK VERSION}.tgz
- Go to the kafka_2.13-{YOUR MSK VERSION}/libs directory, then run the following command to download the Amazon MSK IAM JAR file. The Amazon MSK IAM JAR makes it possible for the client machine to access the cluster.
wget https://github.com/aws/aws-msk-iam-auth/releases/download/v1.1.1/aws-msk-iam-auth-1.1.1-all.jar
- Go to the kafka_2.13-{YOUR MSK VERSION}/bin directory. Copy the following property settings and paste them into a new file. Name the file client.properties and save it.
security.protocol=SASL_SSL
sasl.mechanism=AWS_MSK_IAM
sasl.jaas.config=software.amazon.msk.auth.iam.IAMLoginModule required;
sasl.client.callback.handler.class=software.amazon.msk.auth.iam.IAMClientCallbackHandler
- Open the Amazon MSK console at https://console.aws.amazon.com/msk/.
- Wait for the status of your cluster to become Active. This might take several minutes. After the status becomes Active, choose the cluster name. This takes you to a page containing the cluster summary.
- Select View client information.
- Copy the connection string for the private endpoint.
The client information lists endpoints for each of the brokers. You only need one broker endpoint for the following step.
- Run the following command, replacing BootstrapServerString with one of the broker endpoints that you obtained in the previous step.
<path-to-your-kafka-installation>/bin/kafka-topics.sh --create --bootstrap-server BootstrapServerString --command-config client.properties --replication-factor 3 --partitions 1 --topic MSKTutorialTopic
If the command succeeds, you see the following message:
Created topic MSKTutorialTopic.
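Optionally, you can confirm that the topic exists and check its partition and replication settings before producing data. A sketch, using the same BootstrapServerString and client.properties as above:
# List all topics, then describe the tutorial topic to verify partitions and replication factor.
<path-to-your-kafka-installation>/bin/kafka-topics.sh --list --bootstrap-server BootstrapServerString --command-config client.properties
<path-to-your-kafka-installation>/bin/kafka-topics.sh --describe --topic MSKTutorialTopic --bootstrap-server BootstrapServerString --command-config client.properties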
Produce and consume data
You can now create a console producer and a console consumer to verify that the topic is working.
- Run the following command to start a console producer. Replace BootstrapServerString with the connection string that you obtained in Create a topic. For instructions on how to retrieve this connection string, see Getting the bootstrap brokers for an Amazon MSK cluster.
<path-to-your-kafka-installation>/bin/kafka-console-producer.sh --broker-list BootstrapServerString --producer.config client.properties --topic MSKTutorialTopic
- Enter any message that you want, and press Enter. Repeat this step two or three times. Every time you enter a line and press Enter, that line is sent to your Apache Kafka cluster as a separate message.
- Keep the connection to the client machine open, and then open a second, separate connection to that machine in a new window.
- In the following command, replace BootstrapServerString with the connection string that you saved earlier. Then, to create a console consumer, run the following command with your second connection to the client machine.
<path-to-your-kafka-installation>/bin/kafka-console-consumer.sh --bootstrap-server BootstrapServerString --consumer.config client.properties --topic MSKTutorialTopic --from-beginning
You start seeing the messages you entered earlier when you used the console producer command.
- Enter more messages in the producer window, and watch them appear in the consumer window.
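To confirm that the consumer is reading from the topic, you can also inspect its consumer group. A sketch, assuming the console consumer's auto-generated group name, which appears as console-consumer-<number> (the number below is a placeholder):
# List consumer groups, then describe one to see per-partition offsets and lag.
<path-to-your-kafka-installation>/bin/kafka-consumer-groups.sh --list --bootstrap-server BootstrapServerString --command-config client.properties
<path-to-your-kafka-installation>/bin/kafka-consumer-groups.sh --describe --group console-consumer-12345 --bootstrap-server BootstrapServerString --command-config client.properties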