Multi schemas in one Kafka topic

When working with a combination of Confluent Schema Registry and Apache Kafka, you may notice that pushing messages with different Avro schemas to a single topic is not possible by default. Starting with Confluent Schema Registry version 4.1.0, you can do it, and I will explain how.
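To give a taste of what this looks like, here is a minimal sketch of a producer configuration that relies on the subject name strategy setting introduced in Schema Registry 4.1.0; the broker and registry addresses are placeholder assumptions, and the full post covers the details and the alternatives.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

import io.confluent.kafka.serializers.KafkaAvroSerializer;

public class MultiSchemaProducerConfig {

    public static KafkaProducer<String, Object> producer() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");           // assumption: local broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
        props.put("schema.registry.url", "http://localhost:8081");                      // assumption: local registry

        // The important part: with the default TopicNameStrategy the subject is "<topic>-value",
        // so a second, different schema is rejected as incompatible. TopicRecordNameStrategy
        // (new in 4.1.0) registers each record type under its own subject,
        // "<topic>-<fully qualified record name>", which allows several Avro schemas in one topic.
        props.put("value.subject.name.strategy",
                "io.confluent.kafka.serializers.subject.TopicRecordNameStrategy");

        return new KafkaProducer<>(props);
    }
}
```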
Continue reading “Multi schemas in one Kafka topic”

Spring Cloud Stream + Apache Kafka (PollableMessageSource)

Hi there! Spring Cloud Stream 2.0 recently introduced a new feature, polled consumers (PollableMessageSource), which lets the application control the rate at which it reads from a source (Kafka, RabbitMQ); essentially, you can pause your stream. This is especially helpful with Kafka. Without it, you just keep reading payloads from the topic continuously, as fast as they arrive, non-stop. What if you need to pause your stream? Say we are reading messages from a Kafka topic and sending the data to some external service, and at some point that external service becomes unavailable.
For this use case, I created an application that deals with exactly this issue.
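As a rough illustration (the full post walks through the actual application, which may differ), here is a minimal sketch: the binding exposes a PollableMessageSource instead of a regular channel, and a scheduled method decides whether to poll at all. The ExternalService client, the binding name "input", and the polling interval are assumptions made for the example; the topic itself would be set via spring.cloud.stream.bindings.input.destination in the application properties.

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.cloud.stream.binder.PollableMessageSource;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@SpringBootApplication
@EnableScheduling
@EnableBinding(PolledConsumerApplication.PolledBinding.class)
public class PolledConsumerApplication {

    public static void main(String[] args) {
        SpringApplication.run(PolledConsumerApplication.class, args);
    }

    // The input is declared as a PollableMessageSource instead of a SubscribableChannel,
    // so nothing is read from the Kafka topic until poll() is called explicitly.
    public interface PolledBinding {
        @Input("input")
        PollableMessageSource input();
    }

    @Component
    public static class ExternalServiceForwarder {

        private final PollableMessageSource input;
        private final ExternalService externalService;

        public ExternalServiceForwarder(PollableMessageSource input, ExternalService externalService) {
            this.input = input;
            this.externalService = externalService;
        }

        // We read at our own pace; skipping the poll is what effectively pauses the stream.
        @Scheduled(fixedDelay = 1000)
        public void pollOnce() {
            if (!externalService.isAvailable()) {
                return; // the external service is down, so do not pull more records from Kafka
            }
            // poll() returns false when there was nothing to read
            input.poll(message -> externalService.send(message.getPayload()));
        }
    }

    // Hypothetical stand-in for the real downstream service client.
    @Component
    public static class ExternalService {
        public boolean isAvailable() {
            return true; // in a real application this would be an actual health check
        }

        public void send(Object payload) {
            System.out.println("sent: " + payload);
        }
    }
}
```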
Continue reading “Spring Cloud Stream + Apache Kafka (PollableMessageSource)”

Registering a schema (REST API) with Confluent Schema Registry

In this post, I want to share how you can register your schema with Confluent Schema Registry through its REST API.
It can be tricky at times, because you have to know how to escape characters and use the proper schema format.
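As a quick taste of what the registration call looks like, here is a minimal sketch using Java's HttpClient; the subject name, the Customer schema, and the registry address are illustrative assumptions. The key detail is that the Avro schema is sent as a single escaped JSON string inside a {"schema": ...} wrapper.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterSchema {

    public static void main(String[] args) throws Exception {
        // The Avro schema itself (a hypothetical "Customer" record) and the registry payload
        // that wraps it. The registry expects {"schema": "<schema>"}, i.e. the Avro schema is
        // passed as a single JSON string, so every quote inside it must be escaped.
        String avroSchema = "{\"type\":\"record\",\"name\":\"Customer\","
                + "\"fields\":[{\"name\":\"name\",\"type\":\"string\"}]}";
        String payload = "{\"schema\": \"" + avroSchema.replace("\"", "\\\"") + "\"}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8081/subjects/customer-value/versions")) // assumption: local registry, example subject
                .header("Content-Type", "application/vnd.schemaregistry.v1+json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // On success the registry responds with the id of the registered schema, e.g. {"id":1}
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```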
Continue reading “Registering a schema (REST API) with Confluent Schema Registry”

Spring Boot + Apache Spark + Apache Kafka

In this post, I will explain how you can solve an issue with the serialization/deserialization of Spring beans when using Kafka in a Spark application.
As an example, I found a similar question on StackOverflow.
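To sketch the shape of the problem, and one common workaround (not necessarily the exact approach from the full post): Spark serializes the closures it ships to executors, and a Spring bean captured in such a closure usually is not Serializable, so the job fails with a "Task not serializable" / NotSerializableException error. The MessageService and MessageServiceHolder names below are hypothetical.

```java
import java.io.Serializable;

import org.apache.spark.api.java.JavaRDD;

public class MessageProcessor implements Serializable {

    // The bean reference is transient, so it is not shipped to the executors;
    // instead it is re-created lazily on the executor side.
    private transient MessageService messageService;

    private MessageService service() {
        if (messageService == null) {
            messageService = MessageServiceHolder.get(); // hypothetical executor-side factory/lookup
        }
        return messageService;
    }

    public void process(JavaRDD<String> records) {
        // The lambda only captures "this" (Serializable, transient field excluded),
        // so the closure can be serialized and sent to the executors.
        records.foreach(record -> service().handle(record));
    }

    // Hypothetical service the Spark job needs on the executors.
    public interface MessageService {
        void handle(String record);
    }

    // Hypothetical holder that builds the service outside the Spring context on the executors.
    public static class MessageServiceHolder {
        private static volatile MessageService instance;

        public static MessageService get() {
            if (instance == null) {
                instance = record -> System.out.println("handled: " + record); // stub implementation
            }
            return instance;
        }
    }
}
```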
Continue reading “Spring Boot + Apache Spark + Apache Kafka”