Jeremy Punsalan

Senior Software Engineer

Software Engineering Manager

Freelancer


Blog Post

Hello Kafka!

July 25, 2020 Miscellaneous

Modern architectures are shifting toward the Event-Driven Architecture and Reactive Programming paradigms. Apache Kafka is one popular platform that every engineer should know, or at least have used on a past project.

[Image: kafka]

What is Apache Kafka?

According to its official website, Kafka is a distributed streaming platform intended for building applications with real-time data streams and pipelines.

Kafka is used by popular companies like Twitter and Netflix.

Most often, Kafka is associated with Event-Driven Architecture.

So wait, what the f*** is Event-Driven Architecture??

[Image: event driven]

Imagine the difference between asking your crush out on a date through email versus asking her out in person. You may get the answer immediately when you ask her directly, but you can't do anything else while waiting for her response. Through email, you can drink coffee, code your fix, or do whatever else while still waiting for her reply. Asking someone out is what we call an event. Asking her out directly is comparable to a request-response design (an HTTP request-response), since you require her to answer immediately. Asking her out through email is comparable to an event-driven design: you send the request and don't wait for any response, so in the meantime you can do other tasks, and you'll receive the response once it is ready.

[Image: event-driven-arch]

The event-driven architecture pattern is a popular distributed asynchronous architecture pattern used to produce highly scalable applications. It is also highly adaptable and can be used for small applications as well as large, complex ones. The event-driven architecture is made up of highly decoupled, single-purpose event processing components that asynchronously receive and process events.

I will discuss Event-Driven Architecture, as well as Reactive Programming, in more detail in upcoming posts, but for now, let's go to the "Hello Kafka" Spring Boot app.

"Say Hello to Kafka!"

To dip into the streams of Kafka, I created a simple Spring Boot app that publishes a Java object to Kafka and, at the same time, consumes the objects from a Kafka topic. The Kafka instance I used is deployed on an AWS EC2 cluster. (For this exercise, you can just install Kafka on your own machine.)

I will assume that you already know Spring Boot, so we won't be tackling that here. To get started:

1. Install Kafka

Installing Kafka should be relatively easy; you may refer to other tutorials on the net for how to install it. (I won't cover the installation here, since this post centers on the Spring Boot app itself.)

2. Create a Spring Boot App and add dependencies

You may create a Spring Boot app using either Spring Initializr or the Spring STS tools in Eclipse. Either way, once you have the project in your favorite IDE, you need these dependencies:

spring-boot-starter-web is the starter for Spring Web; since we are creating a REST endpoint to publish a Java object, we need this dependency.

spring-kafka is the library for implementing Spring Boot with Kafka. This includes serialization, deserialization, KafkaTemplate, and more.
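In a Maven project, those two dependencies would look roughly like this in pom.xml (versions omitted here; Spring Boot's dependency management supplies them):

```xml
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
```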

3. Create Entity Object

[Image: entity]

This is just a pure POJO, the object where we store the information. This will be sent to Kafka, and we'll consume the messages in this form as well. In this case, I created a ServiceProvider class with the fields id, name, and occupation.
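A minimal sketch of such an entity, using the field names from the post (the constructors and accessors here are assumptions; the no-args constructor and setters are what JSON deserialization relies on):

```java
// ServiceProvider.java - plain POJO carried as the Kafka message payload
public class ServiceProvider {
    private Long id;
    private String name;
    private String occupation;

    // No-args constructor required for JSON deserialization
    public ServiceProvider() { }

    public ServiceProvider(Long id, String name, String occupation) {
        this.id = id;
        this.name = name;
        this.occupation = occupation;
    }

    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public String getOccupation() { return occupation; }
    public void setOccupation(String occupation) { this.occupation = occupation; }

    @Override
    public String toString() {
        return "ServiceProvider{id=" + id + ", name=" + name
                + ", occupation=" + occupation + "}";
    }
}
```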

4. Create Producer Service

[Image: producer]

This service is the producer: it pushes the message to Kafka. In this case, we are pushing the ServiceProvider object to the Kafka broker under the topic name "providers".

Take note of KafkaTemplate: it plays a role similar to a Spring Data repository (e.g., JpaRepository, MongoRepository). It is the component that sends the "message", or object, to the Kafka broker.

If you're wondering about the configuration of the Kafka broker and such, we will discuss that in a later section. Just relax.
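A sketch of what such a producer service might look like (the class name and logging are assumptions from my side; the topic name "providers" is from the post, and Spring Boot auto-configures the KafkaTemplate from application.yaml):

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class Producer {

    private static final Logger logger = LoggerFactory.getLogger(Producer.class);
    private static final String TOPIC = "providers";

    @Autowired
    private KafkaTemplate<String, ServiceProvider> kafkaTemplate;

    // Publishes the ServiceProvider object to the "providers" topic.
    public void sendMessage(ServiceProvider provider) {
        logger.info("Producing message: {}", provider);
        kafkaTemplate.send(TOPIC, provider);
    }
}
```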

5. Create Consumer Service

[Image: consumer]

This is the listener class that will consume anything from the particular broker topic "providers" in Kafka. Take note that we only need to create a handler method (consume) and declare the object parameter we expect to consume, then add the @KafkaListener annotation to that method, indicating the topic name (topics = "providers") and the consumer group (groupId = "group_id"). Note that you can declare multiple topics, which means the service can consume multiple objects from multiple topics.

In this case, it will only log the consumed message object.
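A sketch of such a listener (class name assumed; topic and group id taken from the post):

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class Consumer {

    private static final Logger logger = LoggerFactory.getLogger(Consumer.class);

    // Invoked automatically for every record arriving on the "providers" topic.
    @KafkaListener(topics = "providers", groupId = "group_id")
    public void consume(ServiceProvider provider) {
        logger.info("Consumed message: {}", provider);
    }
}
```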

6. Create Controller Class for the Rest Endpoint

[Image: controller]

This is the driver class that publishes the ServiceProvider object to the Kafka broker. The method sendMessageToKafkaTopic expects the object and simply passes it to the autowired Producer service.
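Such a controller might look like this (the request paths here are illustrative assumptions; the method name sendMessageToKafkaTopic is from the post):

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/kafka")
public class KafkaController {

    @Autowired
    private Producer producer;

    // Accepts a ServiceProvider as JSON and hands it straight to the producer service.
    @PostMapping("/publish")
    public void sendMessageToKafkaTopic(@RequestBody ServiceProvider provider) {
        producer.sendMessage(provider);
    }
}
```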

7. Configure the application thru application.yaml

[Image: config]

This is the interesting part; there are several items you need to know:

a. bootstrap-servers specifies your Kafka broker's IP address and port number.

b. In the producer section, the keys key-serializer and value-serializer indicate how we are going to serialize the object we send to Kafka. In this case I chose StringSerializer for the key (since the key is a string, "providers") and JsonSerializer for serializing the object.

c. In the consumer section, the keys key-deserializer and value-deserializer do the opposite: they deserialize the key and the JSON-formatted object back into a Java object.

d. For the scenarios above, you may choose from the different serialization and deserialization methods available for Kafka.

e. The value-deserializer will not work if you don't include your packages in the trusted packages (this tells Kafka what type of object it may convert the data to).
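Putting those items together, the application.yaml might look roughly like this (the broker address, group id, and trusted package name are assumptions you would replace with your own):

```yaml
spring:
  kafka:
    bootstrap-servers: localhost:9092
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
    consumer:
      group-id: group_id
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
      properties:
        spring.json.trusted.packages: "com.example.hellokafka.*"
```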

You may now test your application on your local machine and see whether you can send a message as a Java object and receive it in the same form.

In this scenario, I used Postman to send a request to the REST endpoint, posting the ServiceProvider data to the REST service.

[Image: postman]

And when I checked the logs, I could see my Producer successfully sending the data to Kafka, and the Consumer service receiving the object and printing it.

[Image: logs]

You may check out the whole source code in my GitHub repo:

Thank you for visiting my site! Feel free to comment on this post. If you have questions, you may comment or email me at me@jeremypunsalan.com.
