1. Kerberos installation
For Kerberos installation, configuration, and usage, see: https://blog.csdn.net/qq_21383435/article/details/83625252
2. Generate keytab
Execute the following commands on the cdh1 node (the KDC server node):
cd /var/kerberos/krb5kdc/
kadmin.local -q "addprinc -randkey hive/cdh-server1@YONG.COM"
kadmin.local -q "addprinc -randkey hive/cdh-server2@YONG.COM"
kadmin.local -q "addprinc -randkey hive/cdh-server3@YONG.COM"
kadmin.local -q "xst -k hive.keytab hive/cdh-server1@YONG.COM"
kadmin.local -q "xst -k hive.keytab hive/cdh-server2@YONG.COM"
kadmin.local -q "xst -k hive.keytab hive/cdh-server3@YONG.COM"
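As a quick sanity check, you can list the entries in the generated keytab and confirm it contains keys for all three principals (assuming you are still in /var/kerberos/krb5kdc/):
klist -kt hive.keytab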
Copy the hive.keytab file to the /etc/hive/conf directory on the other nodes:
$ scp hive.keytab cdh1:/etc/hive/conf
$ scp hive.keytab cdh2:/etc/hive/conf
$ scp hive.keytab cdh3:/etc/hive/conf
Then set the owner and permissions on cdh1, cdh2, and cdh3 respectively:
$ ssh cdh1 "cd /etc/hive/conf/; chown hive:hadoop hive.keytab; chmod 400 *.keytab"
$ ssh cdh2 "cd /etc/hive/conf/; chown hive:hadoop hive.keytab; chmod 400 *.keytab"
$ ssh cdh3 "cd /etc/hive/conf/; chown hive:hadoop hive.keytab; chmod 400 *.keytab"
Because a keytab is equivalent to a permanent credential, no password needs to be provided when using it (if the principal's password is changed in the KDC, the keytab becomes invalid). Any user with read access to the file can therefore access Hadoop as the identity specified in the keytab, so the keytab file must be readable only by its owner (mode 0400).
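As an optional check, you can verify the permissions and confirm that the keytab can obtain a ticket without a password (this uses the hive/cdh-server1 principal created above; run it as root or as the hive user, since only the owner may read the file):
$ ssh cdh1 "ls -l /etc/hive/conf/hive.keytab"
$ ssh cdh1 "kinit -kt /etc/hive/conf/hive.keytab hive/cdh-server1@YONG.COM && klist"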
3. Configure hive-site.xml
<property>
  <name>hive.server2.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hive.metastore.kerberos.principal</name>
  <value>hive/_HOST@YONG.COM</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.principal</name>
  <!-- For example, if this is hive/cdh-server2@YONG.COM, the corresponding
       beeline connection string must use principal=hive/cdh-server2@YONG.COM -->
  <value>hive/cdh-server2@YONG.COM</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.keytab</name>
  <value>/etc/hive/conf/hive.keytab</value>
</property>
<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.metastore.kerberos.keytab.file</name>
  <value>/etc/hive/conf/hive.keytab</value>
</property>
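These settings only take effect after the Hive Metastore and HiveServer2 are restarted. In a CDH cluster this is normally done from Cloudera Manager; for a manually managed installation, a minimal sketch would be (assuming the hive command is on the PATH, the old processes have been stopped, and the log paths below are just examples):
# restart the metastore and HiveServer2 so the Kerberos settings are picked up
$ nohup hive --service metastore > /var/log/hive/metastore.log 2>&1 &
$ nohup hive --service hiveserver2 > /var/log/hive/hiveserver2.log 2>&1 &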
Final beeline connection statement (the principal must match hive.server2.authentication.kerberos.principal configured above; with Kerberos authentication no username or password is passed, the client's Kerberos ticket is used instead):
!connect jdbc:hive2://localhost:10001/default;principal=hive/cdh-server2@YONG.COM
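Before connecting, the client must hold a valid Kerberos ticket for a principal in the YONG.COM realm. A minimal sketch of a full session (for a quick test you can kinit with the hive service keytab created above, or kinit as any ordinary user principal with a password):
$ kinit -kt /etc/hive/conf/hive.keytab hive/cdh-server2@YONG.COM
$ klist    # confirm the ticket was obtained
$ beeline -u "jdbc:hive2://localhost:10001/default;principal=hive/cdh-server2@YONG.COM"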
Add the following proxy-user settings to core-site.xml:
<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hdfs.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hdfs.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.HTTP.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.HTTP.groups</name>
  <value>*</value>
</property>
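The proxy-user settings are read by the NameNode and ResourceManager. After editing core-site.xml on those nodes you can either restart them or refresh the configuration without a restart, for example (a sketch, assuming it is run as the HDFS/YARN superuser):
$ hdfs dfsadmin -refreshSuperUserGroupsConfiguration
$ yarn rmadmin -refreshSuperUserGroupsConfiguration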
Reference resources: https://blog.csdn.net/a118170653/article/details/43448133