The installation and basic principles of Kafka and ZooKeeper were covered in an earlier blog post, so I won't repeat them here; this post only covers how to integrate Kafka into a Spring Boot project.
I will mainly describe two approaches:
Method 1: use spring-integration-kafka, the Kafka integration plug-in from Spring Integration, wired up through a Spring XML configuration.
The basic Spring Boot dependencies were covered in the previous two blog posts; here we only add the dependency for this integration framework.
Reference resources:
http://www.cnblogs.com/yuanermen/p/5453339.html
http://www.cnblogs.com/yujinghui/p/5424706.html
http://blog.csdn.net/molingduzun123/article/details/51785141
1. Add the dependency package
<dependency>
<groupId>org.springframework.integration</groupId>
<artifactId>spring-integration-kafka</artifactId>
<version>1.3.0.RELEASE</version>
</dependency>
2. Add spring-kafka-consumer.xml
Create a new file spring-kafka-consumer.xml under src/main/resources with the following contents:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:int="http://www.springframework.org/schema/integration"
xmlns:int-kafka="http://www.springframework.org/schema/integration/kafka"
xmlns:task="http://www.springframework.org/schema/task"
xsi:schemaLocation="http://www.springframework.org/schema/integration/kafka http://www.springframework.org/schema/integration/kafka/spring-integration-kafka.xsd
http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/task http://www.springframework.org/schema/task/spring-task.xsd">
<int:channel id="inputFromKafka">
<int:queue/>
</int:channel>
<!--<int:service-activator auto-startup="true"-->
<!--input-channel="inputFromKafka" ref="kafkaConsumerService" method="receiveMessage">-->
<!--</int:service-activator>-->
<!-- Either of the two wiring approaches works. -->
<!-- Use kafkaConsumerService to receive the Kafka messages. -->
<bean id="kafkaConsumerService" class="com.oscar.kafkaTest.service.impl.KafkaConsumerServiceImpl" />
<int:outbound-channel-adapter channel="inputFromKafka"
ref="kafkaConsumerService" method="processMessage" auto-startup="true"/>
<int:poller default="true" id="default" fixed-rate="5"
time-unit="MILLISECONDS" max-messages-per-poll="5">
</int:poller>
<int-kafka:inbound-channel-adapter
kafka-consumer-context-ref="consumerContext" channel="inputFromKafka">
</int-kafka:inbound-channel-adapter>
<bean id="consumerProperties" class="org.springframework.beans.factory.config.PropertiesFactoryBean">
<property name="properties">
<props>
<prop key="auto.offset.reset">smallest</prop>
<prop key="socket.receive.buffer.bytes">10485760</prop>
<!-- 10M -->
<prop key="fetch.message.max.bytes">5242880</prop>
<prop key="auto.commit.interval.ms">1000</prop>
</props>
</property>
</bean>
<!-- agent_log/msg_log -->
<int-kafka:consumer-context id="consumerContext"
consumer-timeout="4000" zookeeper-connect="zookeeperConnect"
consumer-properties="consumerProperties">
<int-kafka:consumer-configurations>
<int-kafka:consumer-configuration
group-id="log-monitor" max-messages="500">
<int-kafka:topic id="agent-log" streams="4"/>
<int-kafka:topic id="msg-log" streams="4"/>
<int-kafka:topic id="task-log" streams="4"/>
</int-kafka:consumer-configuration>
</int-kafka:consumer-configurations>
</int-kafka:consumer-context>
<int-kafka:zookeeper-connect id="zookeeperConnect"
zk-connect="192.168.XX.XX:3181" zk-connection-timeout="6000"
zk-session-timeout="400" zk-sync-time="200"/>
</beans>
There are mainly two places in this configuration file to adapt to your own setup.
<bean id="kafkaConsumerService" class="com.oscar.kafkaTest.service.impl.KafkaConsumerServiceImpl" />
This bean points to the class that processes the received messages, defined as follows:
import java.util.List;
import java.util.Map;
import org.springframework.stereotype.Service;

@Service
public class KafkaConsumerServiceImpl {
    // Called by the outbound-channel-adapter; the map is keyed by topic,
    // then by partition, with the raw message payloads as byte arrays.
    public void processMessage(Map<String, Map<Integer, List<byte[]>>> msgs) {
        System.out.println(msgs);
    }

    public void process(String message) {
        System.out.println(message);
    }
}
Pay attention to the format in which messages are received (see the decoding sketch below).
The other place to change is the ZooKeeper IP address in the int-kafka:zookeeper-connect element.
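As a reference, here is a minimal sketch of an alternative body for processMessage in the class above that decodes the nested map; the topic/partition layout follows the method signature, and the UTF-8 decoding is only an assumption about how the messages were produced:
public void processMessage(Map<String, Map<Integer, List<byte[]>>> msgs) {
    // Outer key: topic; inner key: partition; value: raw message payloads.
    for (Map.Entry<String, Map<Integer, List<byte[]>>> topicEntry : msgs.entrySet()) {
        String topic = topicEntry.getKey();
        for (Map.Entry<Integer, List<byte[]>> partitionEntry : topicEntry.getValue().entrySet()) {
            Integer partition = partitionEntry.getKey();
            for (byte[] payload : partitionEntry.getValue()) {
                // Assumes the producer sent UTF-8 strings; adjust to your own serialization.
                String message = new String(payload, java.nio.charset.StandardCharsets.UTF_8);
                System.out.println(topic + "/" + partition + ": " + message);
            }
        }
    }
}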
3. Import the configuration file
Add the following annotations to the Spring Boot application class:
@Configuration
@ImportResource(locations={"classpath:spring-kafka-consumer.xml"})
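Put together, the application class might look like the following; the Application class name is just a placeholder for your own startup class:
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.ImportResource;

@Configuration
@ImportResource(locations={"classpath:spring-kafka-consumer.xml"})
@SpringBootApplication
public class Application {
    public static void main(String[] args) {
        // Starting the application also starts the Kafka inbound adapter defined in the XML.
        SpringApplication.run(Application.class, args);
    }
}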
Method 2: use Spring Boot's own spring-kafka auto-configuration directly.
1. Add the following to application.properties:
spring.kafka.bootstrap-servers=192.168.101.16:9092
spring.kafka.template.default-topic=test
spring.kafka.consumer.group-id=default3
#spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
#spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer
#spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
#spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer
2. Add a processing method:
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class Receiver {
    // Invoked for every record that arrives on the "test" topic.
    @KafkaListener(topics = "test")
    public void processMessage(String msgs) {
        System.out.println("aa");
        System.out.println(msgs);
    }
}
That's all that is needed.
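The properties above also set a default topic for the producer side. If you want to send messages as well, a minimal sketch could look like this, assuming String key/value serializers (the Spring Boot default, also shown commented out above); the Sender class name is only an illustration:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Component
public class Sender {
    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    public void send(String message) {
        // Sends to spring.kafka.template.default-topic (here: test).
        kafkaTemplate.sendDefault(message);
    }
}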
Spring Boot is still evolving and some plug-ins are not yet fully integrated. The second method in particular took several attempts before it worked, mainly because of strict version requirements between Spring Boot and spring-kafka.
http://blog.csdn.net/aa3313322122/article/details/70225647
http://www.cnblogs.com/xiaojf/p/6613559.html
The specific version restrictions are described in these two blogs.
Download address of this project: http://download.csdn.net/download/u013289746/9951632