Sending and receiving data with Kafka
First, configure the port number and the topic to be used.
Second, load the configured producers and consumers in the Spring bean.xml file.
As follows:
Consumer configuration:
<?xml version="1.0" encoding="UTF-8"?>
<bean id="consumerProperties" class="java.util.HashMap">
<constructor-arg> ...
Posted by francoisp on Sun, 02 Feb 2020 18:44:41 +0100
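For reference, a minimal Java sketch of the kind of consumer properties that the consumerProperties HashMap bean above wires up; the broker address, group id, and deserializer classes are assumed placeholder values, not taken from the original post.

import java.util.HashMap;
import java.util.Map;

// Rough Java equivalent of the consumerProperties HashMap bean shown in the XML excerpt.
// All concrete values are assumed placeholders, not taken from the original post.
public class ConsumerPropertiesSketch {
    public static Map<String, Object> consumerProperties() {
        Map<String, Object> props = new HashMap<>();
        props.put("bootstrap.servers", "localhost:9092");   // assumed broker address
        props.put("group.id", "demo-group");                 // assumed consumer group
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        return props;
    }
}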
Integrating Kafka with the Java API
1. Development preparation
First of all, set up the Kafka (version 1.0.0) environment. The development language used here is Java, and the build tool is Maven.
The Maven dependencies are as follows:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSche ...
Posted by chrispols on Thu, 30 Jan 2020 07:46:14 +0100
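To complement the Maven dependency excerpt above, here is a minimal producer sketch against the kafka-clients API; the broker address and topic name are assumed placeholders.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SimpleProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // assumed broker address
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            // "demo-topic" is an assumed topic name, not from the original post.
            producer.send(new ProducerRecord<>("demo-topic", "key", "hello kafka"));
        }
    }
}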
Flink uses Broadcast State to update streaming configuration in real time
Source link: http://shiyanjun.cn/archives/1857.html , thanks for sharing.
Broadcast State is a kind of Operator State supported by Flink. Using Broadcast State, you can take the data records of one stream in a Flink program and broadcast them to every downstream Task, so that these data records ...
Posted by ryan.od on Tue, 21 Jan 2020 07:03:21 +0100
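The pattern described above can be sketched roughly as follows, assuming a Flink job with one data stream and one configuration stream; all names and the toy string payloads are illustrative, not taken from the linked article.

import org.apache.flink.api.common.state.MapStateDescriptor;
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.streaming.api.datastream.BroadcastStream;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.co.BroadcastProcessFunction;
import org.apache.flink.util.Collector;

public class BroadcastConfigSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Toy data; a real job would read both streams from Kafka or another source.
        DataStream<String> events = env.fromElements("event-1", "event-2", "event-3");
        DataStream<String> configs = env.fromElements("ruleA");

        MapStateDescriptor<String, String> configDescriptor = new MapStateDescriptor<>(
                "config", BasicTypeInfo.STRING_TYPE_INFO, BasicTypeInfo.STRING_TYPE_INFO);

        // Broadcast the configuration stream so every downstream task sees each config record.
        BroadcastStream<String> broadcastConfigs = configs.broadcast(configDescriptor);

        events.connect(broadcastConfigs)
              .process(new BroadcastProcessFunction<String, String, String>() {
                  @Override
                  public void processElement(String value, ReadOnlyContext ctx,
                                             Collector<String> out) throws Exception {
                      // Read the latest broadcast configuration (key "rule" is illustrative).
                      String rule = ctx.getBroadcastState(configDescriptor).get("rule");
                      out.collect(value + " handled with rule=" + rule);
                  }

                  @Override
                  public void processBroadcastElement(String config, Context ctx,
                                                      Collector<String> out) throws Exception {
                      // Each broadcast record updates the shared state on every task.
                      ctx.getBroadcastState(configDescriptor).put("rule", config);
                  }
              })
              .print();

        env.execute("broadcast-config-sketch");
    }
}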
Using Kafka under Windows
1. Prepare ZooKeeper
Download the latest stable version (3.5.6) from the official website. After downloading, extract it locally, find zoo_sample.cfg in the conf directory, rename it to zoo.cfg, and modify the data storage directory in it:
#dataDir=/tmp/zookeeper
dataDir=D:/hecg/apache-zookeeper-3.5.6-bin/data
Start with git com ...
Posted by daverules on Mon, 20 Jan 2020 18:22:24 +0100
Spark Big Data: Building a Real-Time Analysis Dashboard with Spark + Kafka
Building a Real-Time Analysis Dashboard with Spark + Kafka
I. Framework
Spark + Kafka is used to analyze, in real time, the number of male and female students shopping per second. Spark Streaming processes the user shopping log in real time, and then a websocket pushes the data to the browser in real ti ...
Posted by t31os on Fri, 17 Jan 2020 03:40:05 +0100
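A rough sketch of the ingestion side of such a dashboard, assuming the spark-streaming-kafka-0-10 connector; the broker address, topic name, and group id are placeholders, and the per-batch count stands in for the real per-second gender statistics described above.

import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka010.ConsumerStrategies;
import org.apache.spark.streaming.kafka010.KafkaUtils;
import org.apache.spark.streaming.kafka010.LocationStrategies;

public class ShoppingLogStream {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("shopping-dashboard");
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(1));

        Map<String, Object> kafkaParams = new HashMap<>();
        kafkaParams.put("bootstrap.servers", "localhost:9092");   // assumed broker address
        kafkaParams.put("key.deserializer", StringDeserializer.class);
        kafkaParams.put("value.deserializer", StringDeserializer.class);
        kafkaParams.put("group.id", "dashboard-group");            // assumed consumer group

        // "shopping-log" is an assumed topic name, not taken from the original post.
        JavaInputDStream<ConsumerRecord<String, String>> stream = KafkaUtils.createDirectStream(
                jssc,
                LocationStrategies.PreferConsistent(),
                ConsumerStrategies.<String, String>Subscribe(
                        Collections.singletonList("shopping-log"), kafkaParams));

        // Count records per one-second batch as a stand-in for the real gender statistics.
        stream.map(ConsumerRecord::value)
              .count()
              .print();

        jssc.start();
        jssc.awaitTermination();
    }
}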
Using BlockingCollection in .NET Core to implement a simple message queue
Nowadays, message queues are used in more and more scenarios; RabbitMQ and Kafka are commonly used.
Here we use BlockingCollection to implement a simple message queue.
Implementing the message queue
Create a console application with VS2017. Create a DemoQueueBlock class to encapsulate some common checks.
HasEle, which judges wh ...
Posted by plsanders on Mon, 06 Jan 2020 22:43:54 +0100
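The original post implements this in C# with BlockingCollection; since the code samples in this collection are in Java, here is a rough Java analogue of the same idea using java.util.concurrent.BlockingQueue. The class and method names only mirror the post's DemoQueueBlock/HasEle idea and are illustrative.

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Java analogue of the BlockingCollection-based queue described above (original is C#).
public class SimpleMessageQueue {
    private final BlockingQueue<String> queue = new ArrayBlockingQueue<>(100);

    // Producer side: blocks when the queue is full.
    public void send(String message) throws InterruptedException {
        queue.put(message);
    }

    // Consumer side: blocks until an element is available.
    public String receive() throws InterruptedException {
        return queue.take();
    }

    // Rough counterpart of the post's HasEle check: does the queue currently hold elements?
    public boolean hasElements() {
        return !queue.isEmpty();
    }

    public static void main(String[] args) throws InterruptedException {
        SimpleMessageQueue mq = new SimpleMessageQueue();
        new Thread(() -> {
            try {
                mq.send("hello");
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }).start();
        System.out.println(mq.receive());
    }
}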
[Original by HAVENT] Spring Boot + Spring Kafka asynchronous configuration
Recently, our project team used Kafka to manage system logs centrally, but the Kafka cluster (3 servers) went down due to an unexpected failure, with odds that felt comparable to winning the lottery. All the services that used Kafka to send log messages then became stuck. After investigation, it was found that Kafka had crashed and caused the ca ...
Posted by manchesterkid on Mon, 06 Jan 2020 09:51:35 +0100
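One way to keep callers from getting stuck when the brokers are down, sketched here under the assumption of spring-kafka 2.x (where KafkaTemplate.send() returns a ListenableFuture; in 3.x it returns a CompletableFuture): cap max.block.ms and handle the result in an asynchronous callback. The broker address and topic name are assumed placeholders.

import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.SendResult;
import org.springframework.util.concurrent.ListenableFuture;

public class AsyncLogSender {
    public static KafkaTemplate<String, String> buildTemplate() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // Fail fast instead of blocking the caller for the default 60 s when brokers are down.
        props.put(ProducerConfig.MAX_BLOCK_MS_CONFIG, 3000);
        return new KafkaTemplate<>(new DefaultKafkaProducerFactory<>(props));
    }

    public static void sendLog(KafkaTemplate<String, String> template, String line) {
        // send() is asynchronous; the callback reports failures without blocking the caller.
        ListenableFuture<SendResult<String, String>> future =
                template.send("app-log", line);   // "app-log" is an assumed topic name
        future.addCallback(
                result -> { /* delivered */ },
                ex -> System.err.println("log delivery failed: " + ex.getMessage()));
    }
}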
winlogbeat: collecting the Windows event log and enabling the default template and dashboard configuration
winlogbeat is used to collect the Windows system event log.
Official website installation method: https://www.elastic.co/guide/en/beats/winlogbeat/current/winlogbeat-installation.html
Example configuration for collecting events and writing them to Elasticsearch:
winlogbeat.event_logs:
- name: Security
ignore_older: 24h
event_id: 4624, 4625, 4626, 4627 ...
Posted by Seraskier on Mon, 02 Dec 2019 03:09:08 +0100
A brief introduction to using Kafka: 1. Clustering 2. Principles 3. Terminology
Section I: Kafka cluster
Before we begin
If you are a developer and are not interested in building a Kafka cluster, you can skip this chapter and go straight to tomorrow's content.
If you think it does no harm to know more, please keep reading.
As a reminder, there are many figures in this chapter.
Kafka cluster construction
Summary
The ...
Posted by swallace on Thu, 28 Nov 2019 12:09:26 +0100
Spring Cloud Bus custom event pitfalls
This article is based on Spring Cloud Greenwich.SR3 and Spring Boot 2.1.10.RELEASE.
By chance, I found that Spring Cloud also has this component. Combined with message-oriented middleware (RabbitMQ or Kafka), published Spring events can be passed to other JVM projects. It seemed very useful, so I wrote a demo to try it out. The code can be referenced here ...
Posted by psychotomus on Fri, 15 Nov 2019 11:52:20 +0100
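A hedged sketch of what such a custom bus event can look like under the Greenwich version mentioned above; the class name and payload field are illustrative. The receiving application also needs @RemoteApplicationEventScan on a configuration class so the custom type can be deserialized from the bus, and publishing is then an ordinary publishEvent(...) call with the local service id as originService.

import org.springframework.cloud.bus.event.RemoteApplicationEvent;

// Illustrative custom event; names are not taken from the original post.
public class UserChangedEvent extends RemoteApplicationEvent {

    private String username;   // assumed payload field for illustration

    public UserChangedEvent() {
        // No-arg constructor required for JSON deserialization on the receiving side.
    }

    public UserChangedEvent(Object source, String originService, String username) {
        // Omitting the destination targets all services listening on the bus.
        super(source, originService);
        this.username = username;
    }

    public String getUsername() {
        return username;
    }

    public void setUsername(String username) {
        this.username = username;
    }
}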