
Integrating Oozie with Spark2 in HUE and Verifying It

After a failed attempt to add a node, many HDFS blocks were corrupted, so the whole setup had to be reconfigured.

Check the location of the sharelib directory

[root@master126 ~]# oozie admin -oozie http://master126:11000/oozie -sharelibupdate
[ShareLib update status]
sharelibDirOld = hdfs://master126:8020/user/oozie/share/lib/lib_20190521144826
host = http://master126:11000/oozie
sharelibDirNew = hdfs://master126:8020/user/oozie/share/lib/lib_20190521144826
status = Successful

Create the directory

sudo -u oozie hdfs dfs -mkdir /user/oozie/share/lib/lib_20190521144826/spark2

Add the jar files Spark2 needs to the directory

Everything under /opt/cloudera/parcels/SPARK2/lib/spark2/jars, plus

the oozie-sharelib-spark*.jar under /opt/cloudera/parcels/CDH/lib/oozie/oozie-sharelib-yarn/lib/spark

In our current environment this came to 293 jar files in total. I downloaded and packaged them into TIM so they will be easier to reuse next time.
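A minimal sketch of copying those jars into the new sharelib directory, assuming the paths and the lib_20190521144826 timestamp shown above (run from a node where the parcels are installed locally):

sudo -u oozie hdfs dfs -put /opt/cloudera/parcels/SPARK2/lib/spark2/jars/* /user/oozie/share/lib/lib_20190521144826/spark2/
sudo -u oozie hdfs dfs -put /opt/cloudera/parcels/CDH/lib/oozie/oozie-sharelib-yarn/lib/spark/oozie-sharelib-spark*.jar /user/oozie/share/lib/lib_20190521144826/spark2/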

Change the owner and permissions of the directory

sudo -u hdfs hadoop fs -chown -R oozie:oozie /user/oozie/share/lib/lib_20190521144826/spark2
sudo -u hdfs hadoop fs -chmod -R 775 /user/oozie/share/lib/lib_20190521144826/spark2

Update and confirm

oozie admin -oozie http://master126:11000/oozie -sharelibupdate
oozie admin -oozie http://master126:11000/oozie -shareliblist
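If the update succeeded, spark2 should now appear in the -shareliblist output; you can also list the jars inside just that sharelib (same Oozie URL as above):

oozie admin -oozie http://master126:11000/oozie -shareliblist spark2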

Verify with a sample job

Nothing else needs special attention when testing,

except that the job properties must be changed to include

oozie.action.sharelib.for.spark=spark2
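For reference, a minimal job.properties sketch for such a test; the nameNode address comes from the sharelib path above, while the jobTracker port and the application path are assumptions for illustration:

nameNode=hdfs://master126:8020
jobTracker=master126:8032
oozie.use.system.libpath=true
oozie.action.sharelib.for.spark=spark2
oozie.wf.application.path=${nameNode}/user/hue/spark2-test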