spark-submit Development Adaptation
Environment Setup
Start the cluster with `docker-compose up -d`.
| Address | Content |
|---|---|
| ip:8080 | Spark cluster UI |
| ip:8081 | |
| ip:8082 | |
| ip:18081 | |
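For orientation, a minimal docker-compose.yml sketch that matches this port layout; the images, the worker role of 8081/8082, and the role of 18081 are assumptions, so defer to the compose file shipped with the project:

```yaml
# Illustrative only; not the project's actual compose file.
version: "3"
services:
  spark-master:
    image: bde2020/spark-master        # assumed image
    container_name: spark-master       # matches `docker exec -it spark-master` in step 5
    ports:
      - "8080:8080"                    # Spark cluster UI
      - "7077:7077"                    # endpoint behind spark://spark-master:7077
    volumes:
      - ./master1:/master1             # receives the jars and conf from steps 3-4
  spark-worker-1:
    image: bde2020/spark-worker        # assumed image
    ports:
      - "8081:8081"                    # worker UI (assumed)
```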
scs-spark-submit
Invokes scs-spark-executer.
Steps
1. Package scs-spark-executer.
2. Package scs-spark-submit (see the Maven sketch below).
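Assuming both modules are Maven projects (the -jar-with-dependencies suffix points to the maven-assembly-plugin), the usual packaging command is:

```sh
# Run in each module's root; the assembly binding is inferred from the jar name.
mvn clean package
```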
3. Copy scs-spark-executer-1.0-jar-with-dependencies.jar and sparkExecuter.conf into the master1 directory alongside docker-compose.yml.
Edit the Redis configuration parameters in sparkExecuter.conf:

```properties
userPrincipal=sparkuser
userKeytabPath=/opt/FIclient/user.keytab
krb5ConfPath=/opt/FIclient/KrbClient/kerberos/var/krb5kdc/krb5.conf
host=192.168.0.229
port=46379
password=hcloud&1234
db=9
```
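As a rough illustration of how these fields might be consumed, assuming the file is plain key=value properties and the executor talks to Redis through Jedis (both are assumptions about scs-spark-executer's internals):

```java
import java.io.FileInputStream;
import java.util.Properties;
import redis.clients.jedis.Jedis;

public class RedisConfSketch {
    public static void main(String[] args) throws Exception {
        // Load the key=value pairs; the path matches step 3's copy target.
        Properties conf = new Properties();
        try (FileInputStream in = new FileInputStream("/master1/sparkExecuter.conf")) {
            conf.load(in);
        }
        // userPrincipal/userKeytabPath/krb5ConfPath would feed a Kerberos login
        // in a secured FusionInsight cluster; plain Jedis is shown only to
        // illustrate the remaining connection fields.
        try (Jedis jedis = new Jedis(conf.getProperty("host"),
                                     Integer.parseInt(conf.getProperty("port")))) {
            jedis.auth(conf.getProperty("password"));
            jedis.select(Integer.parseInt(conf.getProperty("db")));
            System.out.println("PING -> " + jedis.ping());
        }
    }
}
```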
4. Copy scs-spark-submit-1.0.0.jar into the master1 directory alongside docker-compose.yml.
Edit the parameters:
```yaml
scsdm:
  spark-conf:
    minIdle: 1
    maxPoolSize: 80
    # main class of the application
    mainClass: cn.hancloud.scsdm.SparkExecuter
    # sparkHome
    sparkHome: /spark/
    # javaHome
    javaHome: /usr/lib/jvm/java-1.8-openjdk/
    # location of the application jar; may be local or on HDFS
    jarPath: /master1/scs-spark-executer-1.0-jar-with-dependencies.jar
    # Yarn or StandAlone
    master: spark://spark-master:7077
    # Cluster or Client
    deployMode: client
    # driver memory
    # driverMemory:
    # executor memory
    # executorMemory:
    # number of executors
    # executorInstances:
    # cores per executor
    # executorCores:
    # value of spark.default.parallelism
    # defaultParallelism:
    # extra JVM options for the driver, e.g. GC settings or logging flags
    # driverExtraJavaOptions:
    # other settings
    otherConf:
      - spark.driver.maxResultSize=4g
      - spark.executor.cores=3
```
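These keys line up with Spark's org.apache.spark.launcher.SparkLauncher API; the sketch below shows one plausible mapping, inferred from the key names rather than from scs-spark-submit's actual code:

```java
import org.apache.spark.launcher.SparkLauncher;

public class SubmitSketch {
    public static void main(String[] args) throws Exception {
        // Values mirror the YAML above; the mapping itself is assumed.
        Process spark = new SparkLauncher()
                .setSparkHome("/spark/")                          // sparkHome
                .setJavaHome("/usr/lib/jvm/java-1.8-openjdk/")    // javaHome
                .setAppResource("/master1/scs-spark-executer-1.0-jar-with-dependencies.jar") // jarPath
                .setMainClass("cn.hancloud.scsdm.SparkExecuter")  // mainClass
                .setMaster("spark://spark-master:7077")           // master
                .setDeployMode("client")                          // deployMode
                .setConf("spark.driver.maxResultSize", "4g")      // otherConf entries
                .setConf("spark.executor.cores", "3")
                .launch();
        // Block until the launcher process exits; startApplication(...) with a
        // listener is the asynchronous alternative.
        System.exit(spark.waitFor());
    }
}
```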
5. Enter the /master directory of the master container:

```sh
docker exec -it spark-master /bin/sh
cd /master
```
6. Start scs-spark-submit:

```sh
java -jar scs-spark-submit-1.0.0.jar
```
7. Access IP:8099/execSql.
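One way to exercise the endpoint is curl; the POST method and JSON body below are purely hypothetical, since the actual payload format is defined by scs-spark-submit:

```sh
# Hypothetical request shape; adjust to the service's real API.
curl -X POST "http://IP:8099/execSql" \
     -H "Content-Type: application/json" \
     -d '{"sql": "select 1"}'
```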
Check the logs; they should look like the following.
Code download
Blog address