Use Confluent.io with HiveMQ Cloud Confluent Integration

This article describes how to create a Kafka cluster using the Confluent.io service and use it with the HiveMQ Cloud Confluent integration.

Instructions

  1. Go to confluent.io and sign up for a new account. Sign-up requires credit card details, but new accounts receive $400 of free credit, so the card is not charged until that credit is used up, which typically takes a few weeks.

  2. After signing up, you land on the Welcome page:

    image-20240731-063454.png

  3. Click on “Build a client”, since you will need a client. Confluent.io provides setup steps for different programming languages:

    image-20240731-063636.png

  4. Click on Java.

  5. Set up API key:

  6. When offered to select the key type, select “My Account (for development)”:

  7. The system creates the key. Enter a name for the key and download it:

  8. Set up a topic. Enter a name for your test topic, and the system generates a curl command that accomplishes two things:

    1. Verifying your API keys and the connection to the Confluent cluster

    2. Creating the topic in the Confluent cluster

  9. Copy the curl command and execute it in your bash terminal. If the command succeeds, it creates the topic. Request:

    curl \
      -X POST \
      -H "Content-Type: application/json" \
      -H "Authorization: Basic VVFYRVVFQUhCMlRQWkZBSzpNNjRITFdCUUNvRjFOZW13WFhGbFV2Y1FjUE5CejZCMmJJUFIvVkNpbHNTY2p0VCtZa3dSbUNwL3IwNlUrekNB" \
      https://pkc-w7d6j.germanywestcentral.azure.confluent.cloud:443/kafka/v3/clusters/lkc-x53d1g/topics \
      -d '{"topic_name":"MyTopic1"}'

    Response:

    {
      "kind": "KafkaTopic",
      "metadata": {
        "self": "https://pkc-w7d6j.germanywestcentral.azure.confluent.cloud/kafka/v3/clusters/lkc-x53d1g/topics/MyTopic1",
        "resource_name": "crn:///kafka=lkc-x53d1g/topic=MyTopic1"
      },
      "cluster_id": "lkc-x53d1g",
      "topic_name": "MyTopic1",
      "is_internal": false,
      "replication_factor": 3,
      "partitions_count": 6,
      "partitions": {
        "related": "https://pkc-w7d6j.germanywestcentral.azure.confluent.cloud/kafka/v3/clusters/lkc-x53d1g/topics/MyTopic1/partitions"
      },
      "configs": {
        "related": "https://pkc-w7d6j.germanywestcentral.azure.confluent.cloud/kafka/v3/clusters/lkc-x53d1g/topics/MyTopic1/configs"
      },
      "partition_reassignments": {
        "related": "https://pkc-w7d6j.germanywestcentral.azure.confluent.cloud/kafka/v3/clusters/lkc-x53d1g/topics/MyTopic1/partitions/-/reassignment"
      },
      "authorized_operations": []
    }
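    The Authorization header in the curl command is standard HTTP Basic authentication: the base64 encoding of "<API key>:<API secret>". If you ever need to rebuild the header yourself (for example after rotating keys), a minimal sketch looks like this; the key and secret below are placeholders, not real Confluent credentials:

    ```shell
    API_KEY="key"        # placeholder, not a real Confluent API key
    API_SECRET="secret"  # placeholder, not a real API secret
    # Basic auth token = base64("<key>:<secret>")
    AUTH=$(printf '%s:%s' "$API_KEY" "$API_SECRET" | base64)
    echo "Authorization: Basic $AUTH"
    # → Authorization: Basic a2V5OnNlY3JldA==
    ```

    Paste the resulting line as the `-H "Authorization: ..."` argument of the curl command, substituting your own key pair.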

     

  10. Update the curl command to use GET instead of POST to list the topics you already have:

    curl \
      -X GET \
      -H "Content-Type: application/json" \
      -H "Authorization: Basic VVFYRVVFQUhCMlRQWkZBSzpNNjRITFdCUUNvRjFOZW13WFhGbFV2Y1FjUE5CejZCMmJJUFIvVkNpbHNTY2p0VCtZa3dSbUNwL3IwNlUrekNB" \
      https://pkc-w7d6j.germanywestcentral.azure.confluent.cloud:443/kafka/v3/clusters/lkc-x53d1g/topics
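    The GET response lists topics as JSON objects nested under a "data" array. If you just want the topic names and don't have jq installed, a grep/sed one-liner works; the JSON below is a trimmed, hypothetical sample of the response shape, with most fields omitted:

    ```shell
    # Trimmed sample of the GET /topics response (real responses carry many more fields)
    RESPONSE='{"kind":"KafkaTopicList","data":[{"topic_name":"MyTopic1"},{"topic_name":"MyTopic2"}]}'
    # Pull out each "topic_name" field, then strip the key and quotes
    echo "$RESPONSE" | grep -o '"topic_name":"[^"]*"' | sed 's/.*:"\(.*\)"/\1/'
    # prints:
    #   MyTopic1
    #   MyTopic2
    ```

    In practice you would pipe the curl output straight into the grep/sed pipeline instead of using a stored variable.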

     

  11. Copy the server, username and password:

     

  12. Use the server, username, and password in HiveMQ Cloud Serverless - Integrations - Confluent:
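    As a rough sketch, the integration form expects values along these lines (field names are approximate, and the hostname below is a placeholder; note that the Kafka bootstrap endpoint typically uses port 9092 and is not the same as the REST endpoint on port 443 used by the curl commands above):

    ```
    Bootstrap server : <your-cluster-host>.confluent.cloud:9092
    Username         : <your Confluent API key>
    Password         : <your Confluent API secret>
    ```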

     

  13. For the topic mapping from HiveMQ to Confluent, use the topic you created with the curl command as the destination topic:

    For the reverse mapping from Confluent to HiveMQ, use the topic you created with the curl command as the source topic:
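    Summarized, the two mappings used in this walkthrough look like this (the MQTT topic names "toKafka" and "fromKafka" match the test in the later steps; adjust them to your own names):

    ```
    HiveMQ to Confluent : MQTT topic "toKafka"   ->  Kafka topic "MyTopic1"
    Confluent to HiveMQ : Kafka topic "MyTopic1" ->  MQTT topic "fromKafka"
    ```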

     

  14. Enable the integration and wait until it is applied:

     

  15. Go to the WebClient in the HiveMQ Cloud Serverless console. Connect the client with auto-generated credentials. Subscribe the client to listen to all topics. Then publish a test message to the source MQTT topic.

     

  16. The expected result is that the message is published to the MQTT topic toKafka, forwarded to the Confluent topic MyTopic1, and then pulled from MyTopic1 back to the MQTT topic fromKafka:

     

  17. In confluent.io, go to the Topics section and check the MyTopic1 topic:

     

  18. Messages produced to the topic will be listed there:

Related articles