Welcome to Spring Boot + Apache Kafka Tutorial series. In this lecture, we will configure Kafka Producer and Kafka Consumer in an application.properties file.
Lecture - #5 - Configure Kafka Producer and Consumer
Transcript:
Hi, welcome back. In this lecture, we will configure the Kafka Producer and Consumer in our Spring Boot application. Spring Boot simplifies this a lot: it provides auto-configuration for the Kafka Producer and Consumer, which means we don't have to write any Java configuration code to set them up ourselves.

Let's head over to the official Spring Kafka documentation to see how Spring Boot simplifies this. Search for "spring kafka documentation", hit enter, and click the first link, "Spring for Apache Kafka". In the introduction section you can see the dependency we already added to integrate Kafka into our Spring Boot application. Now scroll down to the getting started section: without Spring Boot, we would have to write a lot of boilerplate Java configuration code, for example a KafkaTemplate bean, a ProducerFactory bean, a ConsumerFactory bean, and the consumer properties that go with them. Spring Boot configures all of these beans for us by default and instead offers external properties that we can simply set in the application.properties file to configure the Kafka Producer and Consumer. A rough sketch of that manual configuration is shown below, just so you can see what Spring Boot is saving us from writing.
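To make that concrete, here is a minimal sketch of the kind of manual Java configuration the Spring for Apache Kafka documentation describes when Spring Boot auto-configuration is not used. This is not code we will actually write in this series, and the class name KafkaManualConfig is just a placeholder for illustration:

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

// Illustrative only: the beans Spring Boot would otherwise auto-configure for us.
@Configuration
public class KafkaManualConfig {

    // Producer side: the factory that creates Kafka producers.
    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }

    // The KafkaTemplate used to send messages, built on the producer factory.
    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

    // Consumer side: the factory that creates Kafka consumers.
    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "myGroup");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }
}

With Spring Boot, auto-configuration creates equivalent beans for us, driven entirely by the properties we are about to set.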
Now let's go back to IntelliJ IDEA, go to the resources section, and open the application.properties file. In this file we are going to configure the Kafka broker address as well as the Consumer- and Producer-related settings.

First, the Consumer. Add the property spring.kafka.consumer.bootstrap-servers. This property lists the addresses of the Kafka brokers. Right now we have only one Kafka broker, running locally, so set the value to localhost:9092. If you have multiple Kafka brokers running in your cluster, you list all of their addresses separated by commas, for example localhost:9092,localhost:9093.

Next, we configure the consumer group. Add the property spring.kafka.consumer.group-id and give it the ID of the consumer group the consumer belongs to, for example myGroup. If you look back at the Kafka ecosystem diagram, a consumer group consists of multiple consumers (consumer one, consumer two, consumer three), and each consumer has to declare which group it belongs to, so here we provide the unique group ID myGroup.

Next, we configure the offset behavior for this consumer with the property spring.kafka.consumer.auto-offset-reset=earliest. This property specifies what to do when there is no initial offset in Kafka or when the current offset no longer exists on the server. The value earliest automatically resets the consumer to the earliest available offset. Other possible values are latest and none, but most of the time we use earliest.

Next, we configure the key and value deserializers for the consumer. Set spring.kafka.consumer.key-deserializer to org.apache.kafka.common.serialization.StringDeserializer; we are using the StringDeserializer class from the Kafka library to deserialize the key of a message. Copy the same property, change key to value, and we have the deserializer for the message value as well.

Now let's configure the Producer, again using the Spring Boot provided properties. Set spring.kafka.producer.bootstrap-servers=localhost:9092; the producer uses the same Kafka broker running on port 9092. Then configure the serializer classes for the key and value: set spring.kafka.producer.key-serializer to org.apache.kafka.common.serialization.StringSerializer to serialize the key of a message, copy the property, and change key to value to serialize the value of a message.

That is pretty much it. This is how we configure the Kafka Producer and Consumer in a Spring Boot application using the Spring Boot provided external properties, without writing any configuration code. The complete application.properties file is shown below.
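For reference, here is the complete application.properties file as configured in this lecture:

spring.kafka.consumer.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=myGroup
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer

spring.kafka.producer.bootstrap-servers=localhost:9092
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer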
Spring Boot provides all of this through auto-configuration, so we only have to set these external properties to configure the Kafka Producer and Consumer. Now let's run the Spring Boot application to verify that the configuration works. Go to the main entry point class and run the application from there, and there we go: the Spring Boot application starts on the embedded server on port 8080 without any errors, which means the Kafka Producer and Consumer configuration in the application.properties file is working as expected. Great, I will see you in the next lecture.
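For completeness, the main entry point class is just the standard generated Spring Boot starter class; the class name below is a placeholder, and yours will match whatever name was generated for your project:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

// Placeholder class name; run whichever main class your project generator created.
@SpringBootApplication
public class SpringBootKafkaApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringBootKafkaApplication.class, args);
    }
}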