SparkException: resolving the "Dynamic partition strict mode" error
Problem scenario
In the spark-shell console, running testDF.write.mode("append").partitionBy("dt").saveAsTable("t_pgw_base_statistics_final_dy_test") fails with: org.apache.spark.SparkException: Dynamic partition strict mode requires at least one static partition column. To turn this off set hive.exec.dynamic.partition.mode=nonstrict
Solution
Hive's strict mode requires at least one static partition column in an insert; since saveAsTable with partitionBy("dt") writes all partitions dynamically, switch the mode to nonstrict before writing:
sqlContext.setConf("hive.exec.dynamic.partition.mode","nonstrict");
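Putting it together, a minimal spark-shell sketch (assuming the Spark 1.x HiveContext API implied by `sqlContext` above; the DataFrame `testDF` and table name come from the original scenario):

```scala
import org.apache.spark.sql.SaveMode

// Enable dynamic partitioning and relax strict mode, which otherwise
// demands at least one static partition column in the insert.
sqlContext.setConf("hive.exec.dynamic.partition", "true")
sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")

// The partitioned append that previously threw SparkException now succeeds:
// every distinct value of "dt" in testDF becomes its own partition.
testDF.write
  .mode(SaveMode.Append)
  .partitionBy("dt")
  .saveAsTable("t_pgw_base_statistics_final_dy_test")
```

Note that setConf only affects the current session; to make the change permanent, set the same property in hive-site.xml.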