Install Elasticsearch + Logstash + elasticsearch-analysis-ik under Ubuntu

Posted by mewhocorrupts on Tue, 04 Jun 2019 01:52:30 +0200

1. Install Elasticsearch

The first time I used the apt-get installation method, but the package source had not been updated, so it installed version 1.7.x; I deleted it without hesitation.

The second time, I downloaded and installed the Elasticsearch zip package directly.

  1. wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-5.5.1.zip
  2. unzip elasticsearch-5.5.1.zip
  3. Configuration. In elasticsearch.yml, uncomment cluster.name; you can change the name as needed. Uncomment node.name. If the node needs to be reachable from IPs other than the current host, uncomment network.host and change its value to 0.0.0.0; you can also allow only a few fixed IPs, separated by commas. The port defaults to 9200; to change it, uncomment http.port and modify its value. See the sketch after this list.
  4. Once configuration is complete, start Elasticsearch by running elasticsearch in the bin directory.
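
For reference, a minimal sketch of the relevant elasticsearch.yml lines after these edits; the cluster and node names here are placeholders, not required values:

elasticsearch.yml:

cluster.name: my-cluster      # uncommented; rename as appropriate
node.name: node-1             # uncommented
network.host: 0.0.0.0         # reachable from other hosts; or a list such as ["192.168.2.19", "127.0.0.1"]
http.port: 9200               # the default; change here if needed

Then start it from the unzipped directory with bin/elasticsearch (add -d to run it in the background).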

2. Install Logstash

> Use Logstash to automatically synchronize table data from MySQL to Elasticsearch.

Installation follows the apt-get procedure given in the official documentation:

  1. Download and install the Public Signing Key: > wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
  2. You may need to install the apt-transport-https package on Debian before proceeding: > sudo apt-get install apt-transport-https
  3. Save the repository definition to /etc/apt/sources.list.d/elastic-5.x.list: > echo "deb https://artifacts.elastic.co/packages/5.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-5.x.list
  4. Finally, install Logstash version 5.5.1: > sudo apt-get update && sudo apt-get install logstash
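
To confirm that the 5.x package was installed, you can check the version; the path below is the default install location for the Debian package:

/usr/share/logstash/bin/logstash --version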

3. Install the logstash-input-jdbc plug-in for Logstash

> Because logstash-input-jdbc is developed in Ruby, you need to change the gem source first; otherwise the installation will stall while requesting resources, because some of its dependencies are hosted on Amazon.

  1. Modify the gem data sources under the Logstash root directory (the edited lines are sketched after this list):
    1. Change the source value in Gemfile to https://ruby.taobao.org
    2. Change the second remote value in Gemfile.jruby-1.9.lock to https://ruby.taobao.org
  2. Install logstash-input-jdbc: go to the Logstash root directory and run bin/logstash-plugin install logstash-input-jdbc. After a while, the installation will succeed.
  3. To configure:
    1. Create a folder under the Logstash directory; the name logstash_jdbc_xxx can be chosen according to your own needs.
    2. Download mysql-connector-java-5.1.38.jar and put it in the folder you just created.
    3. Write the configuration file jdbc.conf
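
The two gem-source edits above look like this after modification (a sketch showing only the changed lines; everything else in both files stays as-is):

Gemfile:

source "https://ruby.taobao.org"

Gemfile.jruby-1.9.lock:

  remote: https://ruby.taobao.org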

jdbc.conf:

input {
    stdin {
    }

    jdbc {
        jdbc_connection_string => "jdbc:mysql://192.168.2.19:3306/survey_acquisition_faq"
        jdbc_user => "app-survey_acquisition"
        jdbc_password => "WyMrCPQB9T5YzuSe"
        # the path to our downloaded jdbc driver
        jdbc_driver_library => "/usr/share/logstash/bin/logstash_jdbc_faq/mysql-connector-java-5.1.38.jar"
        # the name of the driver class for mysql
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        jdbc_paging_enabled => true
        jdbc_page_size => "50000"
        # file containing the SQL statement to execute
        statement_filepath => "/usr/share/logstash/bin/logstash_jdbc_faq/sql/t_help_document.sql"
        # cron-style schedule: run once a minute
        schedule => "* * * * *"
        type => "faq_help_document"
        # remember the last value of update_time between runs
        record_last_run => true
        use_column_value => true
        tracking_column => "update_time"
        clean_run => true
    }

    jdbc {
        jdbc_connection_string => "jdbc:mysql://192.168.2.19:3306/survey_acquisition_faq"
        jdbc_user => "app-survey_acquisition"
        jdbc_password => "WyMrCPQB9T5YzuSe"
        # the path to our downloaded jdbc driver
        jdbc_driver_library => "/usr/share/logstash/bin/logstash_jdbc_faq/mysql-connector-java-5.1.38.jar"
        # the name of the driver class for mysql
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        jdbc_paging_enabled => true
        jdbc_page_size => "50000"
        statement_filepath => "/usr/share/logstash/bin/logstash_jdbc_faq/sql/t_question.sql"
        schedule => "* * * * *"
        type => "faq_help_question"
        record_last_run => true
        use_column_value => true
        tracking_column => "update_time"
        clean_run => true
    }

    jdbc {
        jdbc_connection_string => "jdbc:mysql://192.168.2.19:3306/survey_acquisition_faq"
        jdbc_user => "app-survey_acquisition"
        jdbc_password => "WyMrCPQB9T5YzuSe"
        # the path to our downloaded jdbc driver
        jdbc_driver_library => "/usr/share/logstash/bin/logstash_jdbc_faq/mysql-connector-java-5.1.38.jar"
        # the name of the driver class for mysql
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        jdbc_paging_enabled => true
        jdbc_page_size => "50000"
        statement_filepath => "/usr/share/logstash/bin/logstash_jdbc_faq/sql/t_video.sql"
        schedule => "* * * * *"
        type => "faq_help_video"
        record_last_run => true
        use_column_value => true
        tracking_column => "update_time"
        clean_run => true
    }
}

filter {
    json {
        source => "message"
        remove_field => ["message"]
    }
}

output {
    elasticsearch {
        hosts => "127.0.0.1:9200"
        index => "survey-faq"
        document_id => "%{id}"
        template_overwrite => true
        template => "/usr/share/logstash/template/logstash.json"
    }
    stdout {
        codec => json_lines
    }
}

> template_overwrite and template are the two attributes used to define the index template; the template defined here is the IK word-segmentation (analyzer) template. If IK is not used, both attributes can be deleted. To use the IK analyzer, the plug-in must first be installed, as described in the steps below.
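
The original logstash.json is not reproduced in this post. As a rough sketch, an Elasticsearch 5.x index template that applies the IK analyzer to string fields might look like the following; the index pattern, shard count, and dynamic-template details are assumptions, not the author's actual file:

logstash.json (sketch):

{
    "template": "survey-faq",
    "settings": {
        "number_of_shards": 1
    },
    "mappings": {
        "_default_": {
            "dynamic_templates": [
                {
                    "strings_as_ik": {
                        "match_mapping_type": "string",
                        "mapping": {
                            "type": "text",
                            "analyzer": "ik_max_word",
                            "search_analyzer": "ik_max_word"
                        }
                    }
                }
            ]
        }
    }
}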

  1. Create a folder sql under the logstash_jdbc_xxx directory to hold the SQL files. > Create an xx.sql file in the sql directory containing the SQL statement to execute, for example select * from tablename (see the sketch after this list).
  2. Restart Logstash, and the configured table data will be imported into Elasticsearch.
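
Since the jdbc inputs above set use_column_value => true with tracking_column => "update_time", each SQL file would typically reference the plug-in's built-in :sql_last_value parameter so that only new or updated rows are fetched on each scheduled run. A sketch for one of the files (the table name matches the statement_filepath above; selecting all columns is an assumption):

sql/t_help_document.sql:

select * from t_help_document where update_time > :sql_last_value

Note that clean_run => true in the config resets the saved :sql_last_value each time Logstash starts, so a restart re-imports from scratch. To run with this configuration you can also point Logstash at the file directly; the path below is an assumption that follows the directory layout used above:

/usr/share/logstash/bin/logstash -f /usr/share/logstash/bin/logstash_jdbc_faq/jdbc.conf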

4. Install the IK analyzer

Go to the Elasticsearch root directory and execute

bin/elasticsearch-plugin install https://github.com/medcl/elasticsearch-analysis-ik/releases/download/v5.5.1/elasticsearch-analysis-ik-5.5.1.zip

Wait for the installation to complete, then restart Elasticsearch.
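
To verify that the plug-in loaded, you can exercise one of its analyzers (ik_max_word or ik_smart) through the _analyze API; this assumes Elasticsearch is on the default port 9200:

curl -XPOST 'http://127.0.0.1:9200/_analyze?pretty' -H 'Content-Type: application/json' -d '{"analyzer": "ik_max_word", "text": "中华人民共和国"}'

The response should show the text broken into IK-segmented tokens rather than single characters.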

Topics: JDBC MySQL SQL ElasticSearch