Kerberos Authentication
Environment
The cluster consists of three CentOS 7 nodes:
192.168.30.85 hadoop01
192.168.30.86 hadoop02
192.168.30.87 hadoop03
Software lives under /opt/software; Hadoop is installed at /opt/software/hadoop-3.3.1:
mkdir -p /opt/software
Time Synchronization
Kerberos is sensitive to clock skew, so all nodes must agree on the time. Check with:
date
Set the time manually if needed (substitute the current time):
date -s "20230421 14:59:30"
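Setting the time by hand is fragile: Kerberos rejects requests once client and KDC clocks drift past its tolerance (300 seconds by default). As a quick sanity check, here is a small sketch; the `check_skew` helper and the 300-second threshold are illustrative additions, not part of the original setup.

```shell
# check_skew: compare two epoch timestamps (e.g. `date +%s` taken on two
# nodes) and warn when the drift exceeds Kerberos' default 300s tolerance.
check_skew() {
  t1=$1
  t2=$2
  if [ "$t1" -gt "$t2" ]; then
    diff=$((t1 - t2))
  else
    diff=$((t2 - t1))
  fi
  if [ "$diff" -gt 300 ]; then
    echo "skew ${diff}s exceeds 300s: Kerberos auth will fail"
  else
    echo "skew ${diff}s within tolerance"
  fi
}

# example with literal timestamps 120s apart
check_skew 1700000000 1700000120   # prints: skew 120s within tolerance
```

On a real cluster, feed it `date +%s` locally and `ssh <node> date +%s` remotely; for production, prefer a running chronyd/ntpd over manual `date -s`.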
Kerberos Deployment
Check whether any krb5 packages are already installed:
rpm -qa | grep krb5
Download the following RPMs (the library dependencies plus the client and server packages) from the CentOS 7 mirror:
http://mirror.centos.org/centos/7/os/x86_64/Packages/libevent-2.0.21-4.el7.x86_64.rpm
http://mirror.centos.org/centos/7/os/x86_64/Packages/libverto-libevent-0.2.5-4.el7.x86_64.rpm
http://mirror.centos.org/centos/7/os/x86_64/Packages/words-3.0-22.el7.noarch.rpm
http://mirror.centos.org/centos/7/os/x86_64/Packages/libkadm5-1.15.1-50.el7.x86_64.rpm
http://mirror.centos.org/centos/7/os/x86_64/Packages/krb5-libs-1.15.1-50.el7.x86_64.rpm
http://mirror.centos.org/centos/7/os/x86_64/Packages/krb5-workstation-1.15.1-50.el7.x86_64.rpm
http://mirror.centos.org/centos/7/os/x86_64/Packages/krb5-server-1.15.1-50.el7.x86_64.rpm
Client installation (on every node)
rpm -ivh libevent-2.0.21-4.el7.x86_64.rpm
rpm -ivh libverto-libevent-0.2.5-4.el7.x86_64.rpm
rpm -ivh words-3.0-22.el7.noarch.rpm
rpm -ivh krb5-libs-1.15.1-50.el7.x86_64.rpm
rpm -ivh libkadm5-1.15.1-50.el7.x86_64.rpm
rpm -ivh krb5-workstation-1.15.1-50.el7.x86_64.rpm
Server installation (hadoop02 only)
rpm -ivh krb5-server-1.15.1-50.el7.x86_64.rpm
krb5.conf configuration (on every node)
vim /etc/krb5.conf
# Configuration snippets may be placed in this directory as well
includedir /etc/krb5.conf.d/

[logging]
default = FILE:/var/log/krb5libs.log
kdc = FILE:/var/log/krb5kdc.log
admin_server = FILE:/var/log/kadmind.log

[libdefaults]
dns_lookup_realm = false
ticket_lifetime = 24h
renew_lifetime = 7d
forwardable = true
rdns = false
pkinit_anchors = FILE:/etc/pki/tls/certs/ca-bundle.crt
default_realm = HADOOP.COM
# default_ccache_name must stay commented out; otherwise 'hadoop fs -ls /' fails with:
# org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
#default_ccache_name = KEYRING:persistent:%{uid}
udp_preference_limit = 1

[realms]
# realm definition
HADOOP.COM = {
 # KDC (key distribution center), i.e. the node running the Kerberos server: hadoop02
 kdc = hadoop02
 admin_server = hadoop02
}

[domain_realm]
# map example.com and any *.example.com host into the HADOOP.COM realm
.example.com = HADOOP.COM
example.com = HADOOP.COM
Then distribute the file to the other nodes:
scp -r /etc/krb5.conf root@hadoop01:/etc/
scp -r /etc/krb5.conf root@hadoop03:/etc/
kdc.conf configuration (hadoop02 only)
vim /var/kerberos/krb5kdc/kdc.conf
[kdcdefaults]
 kdc_ports = 88
 kdc_tcp_ports = 88

[realms]
HADOOP.COM = {
 acl_file = /var/kerberos/krb5kdc/kadm5.acl
 dict_file = /usr/share/dict/words
 admin_keytab = /var/kerberos/krb5kdc/kadm5.keytab
 supported_enctypes = aes128-cts:normal des3-hmac-sha1:normal arcfour-hmac:normal camellia256-cts:normal camellia128-cts:normal des-hmac-sha1:normal des-cbc-md5:normal des-cbc-crc:normal
}
Note that the realm name must match the default_realm set in krb5.conf (HADOOP.COM, not the stock EXAMPLE.COM).
ACL configuration (hadoop02 only)
vim /var/kerberos/krb5kdc/kadm5.acl
# any principal of the form */admin@HADOOP.COM gets full admin rights
*/admin@HADOOP.COM *
Initialize the database (hadoop02 only)
kdb5_util create -r HADOOP.COM -s
You will be prompted for the database master password; -s writes a stash file so the KDC can start without re-entering it. Afterwards the database files should exist:
ll /var/kerberos/krb5kdc/
kadm5.acl
kdc.conf
principal
principal.kadm5
principal.kadm5.lock
principal.ok
Start the Kerberos services (hadoop02 only)
systemctl start krb5kdc
systemctl enable krb5kdc
systemctl start kadmin
systemctl enable kadmin
Create the Kerberos admin user and principals (hadoop02 only)
# create the admin principal (you will be prompted to set its password)
kadmin.local -q "addprinc admin/admin@HADOOP.COM"
# create a service principal with a random key
kadmin.local -q "addprinc -randkey test/test"
# export its key into a keytab file
mkdir -p /opt/software/security/keytab
kadmin.local -q "xst -k /opt/software/security/keytab/test.keytab test/test"
chmod 770 /opt/software/security/keytab/
chmod 660 /opt/software/security/keytab/*
On hadoop01 and hadoop03, create the same directory:
mkdir -p /opt/software/security/keytab
chmod 770 /opt/software/security/keytab/
Then copy the keytab over from hadoop02:
scp -r /opt/software/security/keytab/* root@hadoop01:/opt/software/security/keytab/
scp -r /opt/software/security/keytab/* root@hadoop03:/opt/software/security/keytab/
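This walkthrough reuses a single test/test principal for every Hadoop daemon, which keeps the demo short; production clusters normally create one principal per service per host (e.g. hdfs/<host>, yarn/<host>, HTTP/<host>). The sketch below only prints the kadmin.local commands it would run; the service and host lists are illustrative, and you would drop the echo (or pipe into a shell) to execute them.

```shell
# Dry run: print the kadmin.local commands that would create one
# principal (plus a keytab export) per service per host.
gen_principal_cmds() {
  REALM=HADOOP.COM
  KEYTAB_DIR=/opt/software/security/keytab
  for host in hadoop01 hadoop02 hadoop03; do
    for svc in hdfs yarn HTTP; do
      echo "kadmin.local -q \"addprinc -randkey ${svc}/${host}@${REALM}\""
      echo "kadmin.local -q \"xst -k ${KEYTAB_DIR}/${svc}.${host}.keytab ${svc}/${host}@${REALM}\""
    done
  done
}
gen_principal_cmds
```

Run on the KDC node (hadoop02), this emits 18 commands: two (addprinc plus xst) for each of the 9 service/host pairs.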
Test the service from the client nodes
# interactive admin shell; it bypasses authentication, so it only works on the KDC host (hadoop02)
kadmin.local
kadmin.local -q "listprincs"
# password-based login as the admin principal (works from any node)
kinit admin/admin
kadmin -p admin/admin
# keytab-based login as the service principal
kinit -kt /opt/software/security/keytab/test.keytab test/test
Integrating Hadoop with Kerberos
core-site.xml (hadoop02 only)
vim /opt/software/hadoop-3.3.1/etc/hadoop/core-site.xml
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>
hdfs-site.xml (on every node)
vim /opt/software/hadoop-3.3.1/etc/hadoop/hdfs-site.xml
<property>
  <name>dfs.namenode.kerberos.principal</name>
  <value>test/test@HADOOP.COM</value>
</property>
<property>
  <name>dfs.namenode.keytab.file</name>
  <value>/opt/software/security/keytab/test.keytab</value>
</property>
<property>
  <name>dfs.secondary.namenode.keytab.file</name>
  <value>/opt/software/security/keytab/test.keytab</value>
</property>
<property>
  <name>dfs.secondary.namenode.kerberos.principal</name>
  <value>test/test@HADOOP.COM</value>
</property>
<property>
  <name>dfs.block.access.token.enable</name>
  <value>true</value>
</property>
<property>
  <name>dfs.datanode.kerberos.principal</name>
  <value>test/test@HADOOP.COM</value>
</property>
<property>
  <name>dfs.datanode.keytab.file</name>
  <value>/opt/software/security/keytab/test.keytab</value>
</property>
<property>
  <name>dfs.data.transfer.protection</name>
  <value>authentication</value>
</property>
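A side note: Hadoop expands the special `_HOST` token in a principal name to the local hostname at startup, so a single config file can serve every node without hard-coding hostnames. An illustrative fragment, which assumes per-host `hdfs/<host>@HADOOP.COM` principals have been created (this walkthrough instead shares the one test/test principal):

```xml
<!-- illustrative only: requires hdfs/<host>@HADOOP.COM principals to exist -->
<property>
  <name>dfs.namenode.kerberos.principal</name>
  <value>hdfs/_HOST@HADOOP.COM</value>
</property>
```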
yarn-site.xml (hadoop02 only)
vim /opt/software/hadoop-3.3.1/etc/hadoop/yarn-site.xml
<property>
  <name>yarn.resourcemanager.principal</name>
  <value>test/test@HADOOP.COM</value>
</property>
<property>
  <name>yarn.resourcemanager.keytab</name>
  <value>/opt/software/security/keytab/test.keytab</value>
</property>
<property>
  <name>yarn.nodemanager.principal</name>
  <value>test/test@HADOOP.COM</value>
</property>
<property>
  <name>yarn.nodemanager.keytab</name>
  <value>/opt/software/security/keytab/test.keytab</value>
</property>
mapred-site.xml (hadoop02 only)
vim /opt/software/hadoop-3.3.1/etc/hadoop/mapred-site.xml
<property>
  <name>mapreduce.jobhistory.keytab</name>
  <value>/opt/software/security/keytab/test.keytab</value>
</property>
<property>
  <name>mapreduce.jobhistory.principal</name>
  <value>test/test@HADOOP.COM</value>
</property>
Distribute the configuration
scp -r /opt/software/hadoop-3.3.1/etc/hadoop/core-site.xml root@hadoop01:/opt/software/hadoop-3.3.1/etc/hadoop/
scp -r /opt/software/hadoop-3.3.1/etc/hadoop/core-site.xml root@hadoop03:/opt/software/hadoop-3.3.1/etc/hadoop/
scp -r /opt/software/hadoop-3.3.1/etc/hadoop/yarn-site.xml root@hadoop01:/opt/software/hadoop-3.3.1/etc/hadoop/
scp -r /opt/software/hadoop-3.3.1/etc/hadoop/yarn-site.xml root@hadoop03:/opt/software/hadoop-3.3.1/etc/hadoop/
scp -r /opt/software/hadoop-3.3.1/etc/hadoop/mapred-site.xml root@hadoop01:/opt/software/hadoop-3.3.1/etc/hadoop/
scp -r /opt/software/hadoop-3.3.1/etc/hadoop/mapred-site.xml root@hadoop03:/opt/software/hadoop-3.3.1/etc/hadoop/
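The repeated scp invocations above can be collapsed into a loop. The sketch below echoes the commands instead of running them (remove the echo to actually copy), and assumes the same node and file lists as this walkthrough:

```shell
# Dry run: print the scp commands that distribute the edited config
# files from hadoop02 to the other nodes.
CONF_DIR=/opt/software/hadoop-3.3.1/etc/hadoop
dist_cmds() {
  for node in hadoop01 hadoop03; do
    for f in core-site.xml yarn-site.xml mapred-site.xml; do
      echo "scp ${CONF_DIR}/${f} root@${node}:${CONF_DIR}/"
    done
  done
}
dist_cmds
```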
Configure HDFS to use HTTPS for secure transport
1. Generate the key pair
mkdir -p /opt/software/security/https
keytool -keystore /opt/software/security/https/keystore -alias jetty -genkey -keyalg RSA
Enter keystore password: feisuan
Re-enter new password: feisuan
What is your first and last name? [Unknown]:
What is the name of your organizational unit? [Unknown]:
What is the name of your organization? [Unknown]:
What is the name of your City or Locality? [Unknown]:
What is the name of your State or Province? [Unknown]:
What is the two-letter country code for this unit? [Unknown]:
Is CN=Unknown, OU=Unknown, O=Unknown, L=Unknown, ST=Unknown, C=Unknown correct? [no]: y
Enter key password for <jetty>
 (RETURN if same as keystore password):
Re-enter new password:
Verify the keystore was created:
keytool -keystore /opt/software/security/https/keystore -list
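The interactive session above can also be scripted: keytool accepts the standard -dname, -storepass and -keypass options, so no prompts are needed. A dry-run sketch, where the distinguished-name values are placeholders (it echoes the command rather than running it):

```shell
# Build a non-interactive keytool command equivalent to the interactive
# session above; echoed here as a dry run.
build_keytool_cmd() {
  KEYSTORE=/opt/software/security/https/keystore
  DNAME="CN=hadoop02, OU=ops, O=example, L=city, ST=state, C=CN"
  echo "keytool -genkeypair -alias jetty -keyalg RSA -keysize 2048 -keystore $KEYSTORE -storepass feisuan -keypass feisuan -dname \"$DNAME\""
}
build_keytool_cmd
```

Remove the echo wrapper (or eval the printed string) to generate the keystore directly; -genkeypair is the current name of the deprecated -genkey option used above.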
2. Restrict the keystore permissions
chmod 660 /opt/software/security/https/keystore
3. Edit the Hadoop configuration file ssl-server.xml.example
cp /opt/software/hadoop-3.3.1/etc/hadoop/ssl-server.xml.example /opt/software/hadoop-3.3.1/etc/hadoop/ssl-server.xml
vim /opt/software/hadoop-3.3.1/etc/hadoop/ssl-server.xml
<property>
  <name>ssl.server.keystore.location</name>
  <value>/opt/software/security/https/keystore</value>
</property>
<property>
  <name>ssl.server.keystore.password</name>
  <value>feisuan</value>
</property>
<property>
  <name>ssl.server.truststore.location</name>
  <value>/opt/software/security/https/keystore</value>
</property>
<property>
  <name>ssl.server.keystore.keypassword</name>
  <value>feisuan</value>
</property>
<property>
  <name>ssl.server.truststore.password</name>
  <value>feisuan</value>
</property>
4. Distribute the keystore and configuration to the same path on every node
On hadoop01 and hadoop03, create the directory first:
mkdir -p /opt/software/security/https
Then copy from hadoop02:
scp -r /opt/software/security/https/keystore root@hadoop01:/opt/software/security/https
scp -r /opt/software/security/https/keystore root@hadoop03:/opt/software/security/https
scp -r /opt/software/hadoop-3.3.1/etc/hadoop/ssl-server.xml root@hadoop01:/opt/software/hadoop-3.3.1/etc/hadoop/
scp -r /opt/software/hadoop-3.3.1/etc/hadoop/ssl-server.xml root@hadoop03:/opt/software/hadoop-3.3.1/etc/hadoop/
Service Verification
# obtain a ticket with the keytab
kinit -kt /opt/software/security/keytab/test.keytab test/test
klist
# with a valid ticket the listing succeeds
$HADOOP_HOME/bin/hadoop fs -ls /
# destroy the ticket; the same command should now fail with
# "Client cannot authenticate via:[TOKEN, KERBEROS]"
kdestroy
$HADOOP_HOME/bin/hadoop fs -ls /