Hive: opening a Kerberos-authenticated beeline connection

1. Kerberos installation, configuration, and use: https://blog.csdn.net/qq_21383435/article/details/83625252 2. Generate the keytab. Run the following on the cdh1 node (the KDC server): cd /var/kerberos/krb5kdc/ kadmin.local -q "addprinc -randkey hive/cdh-server1@YONG.COM" kadmin.local -q "addprinc - ...
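The excerpt's kadmin command is cut off, so the following is a sketch of the usual flow from principal to beeline connection, using the principal and realm shown above. The keytab filename, HiveServer2 host, and port 10000 are assumptions; adjust them to your cluster.

```shell
# Sketch: create the Hive principal, export a keytab, authenticate, connect.
# Principal/realm follow the excerpt; keytab name and JDBC URL are illustrative.
cd /var/kerberos/krb5kdc/
kadmin.local -q "addprinc -randkey hive/cdh-server1@YONG.COM"
# Export the principal's keys into a keytab file
kadmin.local -q "xst -k hive.keytab hive/cdh-server1@YONG.COM"
# Obtain a ticket from the keytab, then connect with beeline;
# the principal must be repeated in the JDBC URL for Kerberos auth
kinit -kt hive.keytab hive/cdh-server1@YONG.COM
beeline -u "jdbc:hive2://cdh-server1:10000/default;principal=hive/cdh-server1@YONG.COM"
```

These commands require a running KDC and HiveServer2, so they are a reference sequence rather than something to run verbatim.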

Posted by smpdawg on Tue, 10 Dec 2019 17:22:35 +0100

MapReduce in a Maven project, part 2: computing the total salary of employees in each department

Prerequisites: 1. Install JDK 1.8 (on Windows) 2. Install Maven 3.3.9 (on Windows) 3. Install Eclipse (on Windows) 4. Install Hadoop (on Linux) Problem: the input file is EMP.csv, with contents as follows: SAL is the employee salary (int type), ...
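The per-department salary total the job computes can be sanity-checked locally before writing the MapReduce code. The column layout below is an assumption (the classic EMP table: EMPNO,ENAME,JOB,MGR,HIREDATE,SAL,COMM,DEPTNO, so SAL is field 6 and DEPTNO field 8); the sample rows are illustrative, not the article's actual EMP.csv.

```shell
# Local prototype of the per-department salary sum.
# Assumed layout: EMPNO,ENAME,JOB,MGR,HIREDATE,SAL,COMM,DEPTNO
# (SAL = field 6, DEPTNO = field 8); adjust field numbers to your EMP.csv.
cat > /tmp/EMP.csv <<'EOF'
7369,SMITH,CLERK,7902,1980-12-17,800,,20
7499,ALLEN,SALESMAN,7698,1981-02-20,1600,300,30
7521,WARD,SALESMAN,7698,1981-02-22,1250,500,30
EOF
awk -F',' '{sum[$8] += $6} END {for (d in sum) print d, sum[d]}' /tmp/EMP.csv
```

The awk expression mirrors what the mapper (emit DEPTNO, SAL) and reducer (sum per key) will do.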

Posted by tsiedsma on Tue, 10 Dec 2019 17:00:49 +0100

Building a big data platform on Alibaba Cloud: Flume installation, deployment, and testing

I. Flume installation 1. Extract: tar -zxvf flume-ng-1.6.0-cdh5.15.0.tar.gz -C /opt/modules/ 2. Rename: mv apache-flume-1.6.0-cdh5.15.0-bin/ flume-1.6.0-cdh5.15.0-bin/ 3. Configure flume-env.sh: export JAVA_HOME=/opt/modules/jdk1.8.0_151 4. Verify: bin/flume-ng version // Result: Flume 1.6.0-cdh5. ...
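Beyond `flume-ng version`, a minimal agent exercises the deployment end to end. The agent, source, channel, and sink names below (`a1`, `r1`, `c1`, `k1`) and port 44444 are illustrative, not taken from the article.

```shell
# Minimal test agent: netcat source -> memory channel -> logger sink.
# All component names and the port are illustrative.
cat > conf/example.conf <<'EOF'
a1.sources = r1
a1.channels = c1
a1.sinks = k1
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444
a1.channels.c1.type = memory
a1.sinks.k1.type = logger
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
EOF
# Start the agent from the install directory created in steps 1-3
bin/flume-ng agent --conf conf --conf-file conf/example.conf \
  --name a1 -Dflume.root.logger=INFO,console
```

With the agent running, `telnet localhost 44444` and typing a line should produce a logged event on the console.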

Posted by zahidraf on Tue, 10 Dec 2019 04:09:34 +0100

[Hadoop cluster setup] Passwordless SSH login

There are three hosts; their IP addresses and roles in the cluster are as follows: 172.17.0.2 //master 172.17.0.3 //slave1 172.17.0.4 //slave2 To let them log in to one another over SSH, the steps are as follows: create the public key and configure acc ...
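The key-creation step the excerpt starts to describe typically looks like the sketch below. The key path here is a demo path, and the user (`root`) is an assumption; the slave IPs follow the excerpt.

```shell
# Generate an RSA key pair with no passphrase (demo path, not ~/.ssh,
# so this sketch can run without touching real keys)
ssh-keygen -t rsa -N "" -f /tmp/demo_id_rsa -q
# On a real master node you would then push the public key to each host
# (not executed here; user and hosts per the excerpt's cluster layout):
#   ssh-copy-id root@172.17.0.3   # slave1
#   ssh-copy-id root@172.17.0.4   # slave2
# and allow the master to ssh to itself:
#   cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
ls -l /tmp/demo_id_rsa /tmp/demo_id_rsa.pub
```

After `ssh-copy-id`, `ssh slave1` should log in without a password prompt.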

Posted by pskZero7 on Mon, 09 Dec 2019 12:24:20 +0100

Hadoop configuration on ECS

Table of contents: cluster passwordless SSH login; Hadoop installation and configuration; opening ports; configuration; formatting HDFS (Master, Slave); starting Hadoop; verifying the installation. The previous article covered setting up Hadoop in a virtual machine. Building on that, due to the particularities of cloud ...

Posted by silverspy18 on Mon, 09 Dec 2019 05:49:02 +0100

Big data tutorial (8.4): a mobile traffic analysis case

A previous post covered the implementation and principles of wordcount with MapReduce. This post continues with a classic mobile traffic analysis case, to help you understand and use the Hadoop platform in practical work. I. Requirements: below is a mobile traffic log. We need to analyze the upstream traffic, downstream ...
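The log format is cut off in the excerpt, so the sketch below assumes a simplified three-column layout (phone number, upstream bytes, downstream bytes); the phone numbers and byte counts are made up. It prototypes the per-phone aggregation the case asks for before moving it into MapReduce.

```shell
# Prototype of per-phone upstream/downstream totals.
# The three-column layout (phone up down) is an assumed simplification
# of the real traffic log; sample data is invented.
cat > /tmp/flow.log <<'EOF'
13726230503 2481 24681
13826544101 264 0
13726230503 100 200
EOF
# For each phone: total up, total down, and combined traffic
awk '{up[$1]+=$2; down[$1]+=$3} \
  END {for (p in up) print p, up[p], down[p], up[p]+down[p]}' /tmp/flow.log
```

In the MapReduce version, the mapper emits (phone, up, down) and the reducer performs the same per-key sums, usually via a custom Writable bean holding the two counters.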

Posted by chantown on Fri, 06 Dec 2019 23:52:36 +0100

Advanced Spark SQL cases

(1) A classic case: wordcount with a UDTF. Data format: each line is a space-separated string. Code implementation: object SparkSqlTest { def main(args: Array[String]): Unit = { // Suppress noisy logs Logger.getLogger("org.apache.hadoop").setLevel(Level.WARN) Logger.getLogger("org.apache.spark").setLevel(Leve ...

Posted by marklarah on Tue, 03 Dec 2019 04:36:38 +0100

Querying Excel with Java and Spark SQL

Download Spark from the official Spark download page; any version will do. After downloading, extract it and place it under bigdata (the directory can be changed). Download winutils.exe, which Hadoop requires on Windows; you can find it online, so it won't be uploaded here. In fact, this file is optional, and the error it reports doesn' ...

Posted by GrayFox12 on Tue, 03 Dec 2019 04:21:44 +0100

Installing MariaDB, Hadoop, and Hive on Arch Linux/Manjaro (pseudo-distributed)

Hadoop 2.x.y (pseudo-distributed). Refer to the single-node setup section for the matching version on the official site: https://hadoop.apache.org/docs/ First, install ssh and rsync. Then download the binary package and extract it. Set the extracted root directory as the environment variable HADOOP_HOME # example export HADOOP_HOME=/home/yzj/Applicat ...
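The environment-variable step above can be sketched as follows. The install path here is deliberately illustrative (the excerpt's own example path is truncated); on top of HADOOP_HOME, the Hadoop bin and sbin directories usually go on PATH so `hadoop` and the start scripts resolve.

```shell
# Illustrative path: replace with wherever you extracted the Hadoop package
export HADOOP_HOME=/opt/hadoop
export PATH="$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin"
# Confirm the bin directory is now on PATH
echo "$PATH" | grep -q "$HADOOP_HOME/bin" && echo "HADOOP_HOME on PATH"
```

Putting these exports in `~/.bashrc` (or the shell's profile) makes them persist across sessions.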

Posted by beerman on Sat, 30 Nov 2019 22:12:43 +0100

High-availability configuration for a distributed Hadoop environment

The previous article introduced distributed Hadoop configuration; this time we use ZooKeeper to make Hadoop highly available. 1. Environment preparation: 1) set the IP 2) map hostnames to IP addresses 3) turn off the firewall 4) set up passwordless SSH login 5) create hado ...

Posted by eflopez on Tue, 26 Nov 2019 18:53:08 +0100