SparkException: Resolving the "Dynamic partition strict mode" error

Problem scenario

In the spark-shell console, running testDF.write.mode("append").partitionBy("dt").saveAsTable("t_pgw_base_statistics_final_dy_test") fails with: org.apache.spark.SparkException: Dynamic partition strict mode requires at least one static partition column. To turn this off set hive.exec.dynamic.partition.mode=nonstrict

Solution

Hive's strict dynamic-partition mode rejects writes in which every partition column is dynamic. Setting the mode to nonstrict lifts that restriction, after which the write succeeds:

sqlContext.setConf("hive.exec.dynamic.partition.mode","nonstrict");
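
For context, a minimal spark-shell sketch of the full sequence, assuming testDF already exists in the session and contains a dt column (the table name is taken from the error above). Explicitly enabling hive.exec.dynamic.partition is an extra precaution; it is on by default in current Hive versions. On Spark 2.x and later, spark.conf.set serves the same purpose as sqlContext.setConf.

// Sketch only: testDF and the target table come from the scenario above.
sqlContext.setConf("hive.exec.dynamic.partition", "true")            // ensure dynamic partitioning is enabled
sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")  // allow all partition columns to be dynamic

testDF.write
  .mode("append")                  // append to the existing Hive table
  .partitionBy("dt")               // dt is the (fully dynamic) partition column
  .saveAsTable("t_pgw_base_statistics_final_dy_test")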
