Abstract: 1. Enabling dynamic partitioning when launching spark-shell: --executor-memory 16G \ --total-executor-cores 10 \ --executor-cores 10 \ --conf "spark.hadoop.hive.exec.dynamic.partition=true Read more
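For reference, a minimal sketch of the full invocation this abstract appears to describe. The abstract is cut off mid-flag, so the closing quote and the final nonstrict-mode line are assumptions (setting hive.exec.dynamic.partition.mode=nonstrict is the usual companion to enabling dynamic partitions); --total-executor-cores assumes a standalone cluster manager, where that flag applies.

    # Sketch only: the nonstrict-mode line below is an assumption,
    # not shown in the truncated abstract.
    spark-shell \
      --executor-memory 16G \
      --total-executor-cores 10 \
      --executor-cores 10 \
      --conf "spark.hadoop.hive.exec.dynamic.partition=true" \
      --conf "spark.hadoop.hive.exec.dynamic.partition.mode=nonstrict"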
posted @ 2021-07-08 09:18 yanzu Views(1023) Comments(0) Recommended(0)