How to Process Notifications With Kafka, MinIO, and Python


Event-driven architecture is a popular strategy for letting decoupled applications communicate with one another.

As an example of how this architecture works, we’ll set up MinIO to send bucket notifications to a Kafka topic. Then we’ll create a simple Kafka listener in Python to consume the event data.

Let’s get started!

First, let’s prepare our local development infrastructure. The easiest way to get started is by running a docker-compose.yml file:
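The original compose file isn’t reproduced here, so below is a minimal sketch of what it could look like. The Bitnami images, the dual-listener Kafka setup, and the `MINIO_NOTIFY_KAFKA_*` variables are assumptions; adapt images, versions, and values to your environment.

```yaml
version: "3"

services:
  zookeeper:
    image: bitnami/zookeeper:latest
    environment:
      - ALLOW_ANONYMOUS_LOGIN=yes
    networks:
      - kafka-net

  kafka:
    image: bitnami/kafka:latest
    ports:
      - "9092:9092"                     # exposed for local testing (kcat, Python)
    environment:
      - KAFKA_CFG_ZOOKEEPER_CONNECT=zookeeper:2181
      - ALLOW_PLAINTEXT_LISTENER=yes
      # Two listeners: one for containers on kafka-net, one for the host.
      - KAFKA_CFG_LISTENERS=INTERNAL://:29092,EXTERNAL://:9092
      - KAFKA_CFG_ADVERTISED_LISTENERS=INTERNAL://kafka:29092,EXTERNAL://localhost:9092
      - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=INTERNAL:PLAINTEXT,EXTERNAL:PLAINTEXT
      - KAFKA_CFG_INTER_BROKER_LISTENER_NAME=INTERNAL
    depends_on:
      - zookeeper
    networks:
      - kafka-net

  minio:
    image: minio/minio:latest
    command: server /data --console-address ":9001"
    ports:
      - "9000:9000"
      - "9001:9001"                     # web console
    environment:
      - MINIO_ROOT_USER=minioadmin
      - MINIO_ROOT_PASSWORD=minioadmin
      # Kafka notification target: broker address and topic name.
      - MINIO_NOTIFY_KAFKA_ENABLE_PRIMARY=on
      - MINIO_NOTIFY_KAFKA_BROKERS_PRIMARY=kafka:29092
      - MINIO_NOTIFY_KAFKA_TOPIC_PRIMARY=my-notifications
    depends_on:
      - kafka
    networks:
      - kafka-net

networks:
  kafka-net:
    driver: bridge
```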

We create containers for MinIO, Kafka, and ZooKeeper.

  • The MinIO service acts as a producer and sends notification records to Kafka.
  • Kafka is a streaming system that will consume the MinIO event data and handle the records accordingly.
  • ZooKeeper is used to track the status of nodes in the Kafka cluster and to maintain a list of Kafka topics, partitions, etc.

Important notes:

  • MinIO’s environment configuration defines the Kafka properties, for example, the broker port and the notification topic name.
  • The Kafka broker is exposed on port 9092, so we can refer to it for local testing.
  • All services should run in the same network, in this case, kafka-net.

Run the docker-compose.yml file:

$ docker-compose up -d

Then log in to the MinIO console running on http://localhost:9001/ using the credentials minioadmin/minioadmin.

Make sure that MinIO and Kafka are correctly configured. Navigate to Settings -> Notifications. The Notification Endpoints should be online:

Notification endpoints for Kafka

Create a new bucket:

Creating a new bucket

Click on the new bucket and select Manage.

Subscribe to the notification events by clicking the Subscribe To Event button:

Subscribing to a bucket event

The ARN drop-down should automatically suggest Kafka. Subscribe to some events, e.g. put, get, and delete.

We’ve already configured MinIO and Kafka and subscribed to the notification events. So now, when we upload a file, we should get notified. How do we check that the notification is actually received? Let’s use kcat, a lightweight, easy-to-use message reader tool.

To install kcat on Ubuntu, run in the terminal:

$ apt-get install kafkacat

Then, let’s start the tool, passing the broker’s port and the topic’s name as arguments:

$ kafkacat -b localhost:9092 -t my-notifications

That’s it!

Now, let’s upload a file to the bucket in MinIO.

Your kcat console should immediately show you the notification:

Kcat output

Awesome! We now successfully receive notifications from MinIO via Kafka.

Several Kafka clients are available for Python. In this tutorial, I’m using the kafka-python library:

$ pip3 install kafka-python

Let’s create a new Python file with the following content:

The kafka-listener.py code

We initialize a KafkaConsumer with the following arguments:

  • topic — the name of the topic to listen to.
  • value_deserializer — deserializes the data into a common JSON format. This format is handy for flexible data manipulation.

Note that there’s another optional argument, bootstrap_servers, whose default value is localhost:9092. Since that’s the one we use, I didn’t include it in the argument list.
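To make the deserializer concrete, here is a small standalone illustration. The payload is a made-up fragment roughly in the shape of a MinIO bucket event; real notifications carry many more fields.

```python
import json

# The same kind of deserializer we pass to KafkaConsumer: raw bytes -> dict.
value_deserializer = lambda m: json.loads(m.decode("utf-8"))

# A made-up message value, loosely shaped like a MinIO bucket event.
raw = b'{"EventName": "s3:ObjectCreated:Put", "Key": "mybucket/photo.jpg"}'

event = value_deserializer(raw)
print(event["EventName"])  # -> s3:ObjectCreated:Put
print(event["Key"])        # -> mybucket/photo.jpg
```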

We start the Kafka listener in a background thread. The program polls the topic in a loop. When a message arrives, we simply print out the data. Usually, we would execute some code to handle the received records.

Run the program:

$ python3 kafka-listener.py

Let’s test it by uploading a new file to the MinIO bucket.

You should see a ConsumerRecord in the terminal’s console:

Results consumed by the Python Kafka listener

As you can see, the program works as expected. It was quite simple to set up!

Of course, it should work for delete and download events, too.

In this short tutorial, I showed you how to receive bucket notifications from MinIO using Kafka. We also used the kafka-python library to create a small Python program that consumes the records.

The consumer configuration in this example is basic, for demo purposes. For more advanced usage, check out the documentation.

If you enjoyed this topic, you might like a similar one based on the MinIO, RabbitMQ, and Java technology stack:

Thanks for reading, and happy coding!
