Kafka UI testing tool

  • The console utility Protokaf has no UI, and the received data needs to be formatted into a JSON structure for readability (which means yet another extra application).

  • The Kowl UI is convenient only for monitoring the state of topics; only recently did it become possible to read messages without a complex decoding flow, and it still lacks a producer option.

In search of a more convenient solution, a colleague recommended Plumber, a graphical application that can both monitor topics and produce messages.

In this article I won't explain what Kafka is or how brokers work – there are already plenty of excellent materials on those topics. I want to share my experience with this tool. My goal is not to compare it with other existing solutions, but simply to tell you how Plumber helped me simplify manual testing of Kafka in staging environments.

Getting to know Plumber

Plumber is a GUI application for interacting with Apache Kafka. It provides a convenient, minimalistic interface and includes the following features:

– Connecting to a Kafka broker.
– Consuming from and producing to topics.
– Support for various data formats (JSON, protobuf, Avro, and others).
– Messages are displayed in JSON format, which makes them easier to read and work with.

Installation and configuration of Plumber

Let's start with the fact that installation and configuration are not complicated at all.

Plumber installation

1. Go to the project's releases page.
2. Download the latest version of Plumber for your operating system (builds are available for Windows, macOS, and Linux).
3. Unpack the archive and run the application.
The binaries are not signed, so Windows and macOS will complain.
To bypass the macOS security block manually, either:
– open the app via the context menu (Control+click), then select “Open”;
– or remove the quarantine attributes with the xattr command:
$ xattr -cr plumber.app

After starting Plumber, we proceed to setting up a connection to Kafka.

Setting up Kafka

1. Create a cluster by selecting New Kafka Cluster.
2. To configure a connection to your Kafka broker, you must specify the following parameters:

2.1. Cluster name: enter a friendly name so you can quickly identify which cluster to connect to if you have several configured. You can’t run two connections at the same time, but switching between them is convenient.

2.2. Kafka broker address (bootstrap.servers) – specify the server's DNS name or IP address. The client uses the broker address to establish the connection and transmit messages.

2.3. Authentication and Security Options:

  • sasl.mechanisms — the authentication mechanism that determines how the client is authorized. In most cases PLAIN or SCRAM-SHA-256/512 is used; the choice depends on how your Kafka cluster is configured.

  • sasl.username and sasl.password — credentials for authenticating to the Kafka cluster. These parameters are used together with the SASL mechanism and must match the credentials configured on the Kafka broker side.

  • security.protocol – the security protocol that defines how data is transferred between the client and the broker. The most common values are SASL_SSL and SASL_PLAINTEXT: SASL_SSL implies an encrypted connection, while SASL_PLAINTEXT is suitable for test environments without encryption. (These same keys appear in the code sketch after step 4 below.)

3. To check that everything is configured correctly, click Test Kafka Connectivity. You should get an approving OK. Then click “Save” and go to the newly created cluster.

4. The Overview section automatically shows the number of topics accessible through the specified Kafka broker.
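
For comparison, these are exactly the configuration keys used by librdkafka-based clients, so the same connectivity check is easy to script. Below is a minimal sketch using the confluent-kafka Python client; the broker address and credentials are placeholders, not values from this article:

# Minimal connectivity check mirroring Plumber's cluster settings.
from confluent_kafka.admin import AdminClient

conf = {
    "bootstrap.servers": "kafka.example.com:9092",  # placeholder broker address
    "security.protocol": "SASL_SSL",                # or SASL_PLAINTEXT without encryption
    "sasl.mechanisms": "SCRAM-SHA-256",             # or PLAIN / SCRAM-SHA-512
    "sasl.username": "test-user",                   # placeholder credentials
    "sasl.password": "secret",
}

admin = AdminClient(conf)
metadata = admin.list_topics(timeout=10)  # raises KafkaException if unreachable
print(f"OK, {len(metadata.topics)} topics visible")  # same count as in Overview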

Let's start by listening to messages.

Setting up Consumer (message reading)

1. Go to the Topics section.

2. Of course, the fastest way to find the desired topic is through search.

3. Select the topic you want to listen to by clicking on the magnifying glass.
4. Configure the message consumption parameters, for example the key type (if any) and the deserialization format (protobuf, JSON, and others).

Listening to topics in Apache Kafka when messages are serialized in protobuf format requires a .proto file. This file describes the data structure, allowing the application to correctly deserialize messages and display them in a human-readable format. Therefore, I first downloaded it from the project repository and specified the path to the .proto file.

5. Click Start. If no limit is set, new messages will arrive in real time. (For a rough client-side equivalent of this consumer flow, see the code sketch at the end of this section.)

One of the useful features of Plumber is the ability to set up filters to find the desired messages in the stream. But I want to note that I encountered a problem: sometimes filtering did not work correctly, and previously sent messages were not displayed. However, after setting up the filters, there were no problems with new messages – they were displayed stably.

Another feature I can’t help but mention is message history: the application lets you view not only current messages but past ones as well, so you can analyze earlier events or errors.

6. To read a message, open it with a double click or via the details button. Messages can then be reused in producer tests or recorded in test artifacts.

The consumer also has a Clear history option for when the flow of messages becomes too large, or when you want to remove previous data for easier analysis. This is very convenient when you need a clean stream to test a new scenario.
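
As promised, here is roughly what this consumer flow looks like in code. This is a sketch with the confluent-kafka Python client; the topic name, the consumer group, and the order_pb2 module (generated from the project's .proto file) are hypothetical, and conf is the connection dictionary from the earlier sketch:

# Sketch of the consumer flow: subscribe, poll, deserialize, print.
from confluent_kafka import Consumer

consumer = Consumer({
    **conf,                             # connection settings from the sketch above
    "group.id": "plumber-manual-test",  # hypothetical consumer group
    "auto.offset.reset": "earliest",    # read history too, not just new messages
})
consumer.subscribe(["orders"])          # hypothetical topic name

try:
    while True:
        msg = consumer.poll(1.0)        # wait up to 1 s for the next message
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        # For a protobuf topic, parse with the class generated from the
        # project's .proto file (order_pb2 here is a hypothetical module):
        #   event = order_pb2.Order()
        #   event.ParseFromString(msg.value())
        print(msg.topic(), msg.partition(), msg.offset(), msg.value())
finally:
    consumer.close()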

Setting up Producer (sending messages)

We've reached Plumber's most useful function: the producer. The tool lets you simulate sending messages from an upstream service, which opens up a lot of testing opportunities. For example, I modify data, replacing values in fields such as id, dates, and statuses, to check that processing stays stable. Or I change the message structure, reordering fields or deleting/adding them, to simulate changes in the interaction protocol. It also helps and speeds up work when testing corner cases: passing invalid data or empty values, or reproducing rare and complex scenarios that are hard to trigger naturally, for example when a message is generated only under rare conditions or after a long wait. In such cases the producer significantly speeds up testing, with no need to wait for these situations to actually occur.

Using the producer lets you test logic effectively during early checks, for example when your modified service is ready before the upstream producer service. This helps catch potential bugs before integration testing and saves time. Of course, full integration testing is still needed to make sure that all services interact correctly and everything works as intended.

Let's get started.

1. Click on the Producer button.

2. In the window that opens, select a topic from the list. Unfortunately, there is no search here; you have to scan the list by eye, which can be painful when there are many topics.
3. Select the data format to be sent. As before, I select protobuf and load the .proto file.

4. Copy a message from the consumer and change the data a little. For example, I'll set the automaticRouting field to false and put a date in the deleted_at field. Click Produce to topic to send the message. The right-hand column shows the time at which the message was sent.

5. You can then check in the database what was changed or recorded when the service received such a message. (A code sketch of the same produce step follows below.)
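
The same trick is easy to script. Here is a sketch with the confluent-kafka Python client, reusing the conf dictionary from the earlier sketches; the topic name and the date are placeholders, and the field names are the ones from my example. For brevity the sketch serializes JSON; for a protobuf topic you would build the generated message class and call SerializeToString() instead:

# Sketch of the produce step: send a modified copy of a consumed message.
import json
from confluent_kafka import Producer

producer = Producer(conf)  # connection settings from the earlier sketch

# A copy of a consumed message with a couple of fields tweaked,
# as in the example above; the date is a placeholder.
event = {"automaticRouting": False, "deleted_at": "2024-01-15T00:00:00Z"}

def on_delivery(err, msg):
    # Called when the broker confirms (or rejects) delivery.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}] @ {msg.offset()}")

producer.produce("orders",  # hypothetical topic name
                 value=json.dumps(event).encode("utf-8"),
                 on_delivery=on_delivery)
producer.flush(10)  # wait up to 10 s for the delivery report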

Benefits of using Plumber

To summarize, I would like to once again highlight the key advantages, based on my experience:

  1. Minimalistic and intuitive graphical interface: Plumber makes working with Kafka easier and faster. Thanks to the simple, functional interface, interacting with messages is clear and requires no extra effort or extra applications.

  2. Flexibility in data formats: Plumber supports formats such as protobuf and JSON, which lets you adapt to a specific project's requirements.

  3. Easy to set up: Settings are easy and quick, which is especially important in dynamic work environments.

  4. Open source and free to use.

  5. Long-term perspective: these days it is important to have tools that do not depend on the political (or any other) situation. There is one drawback here, though: at the moment the developer has suspended support for the application.

However, as I already mentioned, the application has some minor drawbacks. Another problem I ran into was the program quitting unexpectedly. Luckily, the cluster settings are saved, so you don't have to enter them again. But it becomes unpleasant if you prepared a message for the producer directly in Plumber rather than in a text editor. After the first such incident, I began preparing the message body in a notepad and then pasting it into Plumber; this way I can return to the prepared data in case of a crash.

Conclusion

Plumber has become a great tool for my manual testing of microservices that use Kafka. I'm not claiming it is a flawless super-application, but it has greatly simplified the testing process for me compared to previous tools.

If you work with Kafka and are looking for a convenient UI testing tool, Plumber is just the solution that can make your work much easier. I hope my experience will be useful for those who are faced with similar tasks.

Cooper's tech team (ex-SberMarket) runs social media accounts with news and announcements. If you want to know what's under the hood of high-load e-commerce, follow us on Telegram and YouTube. You can also listen to the podcast “For tech and these” from our IT managers.
