 

Ignoring non-spark config property: hive.exec.dynamic.partition.mode

How to run a Spark-shell with hive.exec.dynamic.partition.mode=nonstrict?

I tried (as suggested here):

  export SPARK_MAJOR_VERSION=2; spark-shell --conf "hive.exec.dynamic.partition.mode=nonstrict" --properties-file /opt/_myPath_/sparkShell.conf

but I get the warning "Ignoring non-spark config property: hive.exec.dynamic.partition.mode=nonstrict".


PS: I am using Spark version 2.2.0.2.6.4.0-91 and Scala version 2.11.8.

NOTE

This need arose after an error on df.write.mode("overwrite").insertInto("db.partitionedTable"):

org.apache.spark.SparkException: Dynamic partition strict mode requires at least one static partition column. To turn this off set hive.exec.dynamic.partition.mode=nonstrict
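
For reference, here is a minimal sketch that reproduces the exception in a Hive-enabled spark-shell; the schema and sample rows are hypothetical, only the table name comes from the call above:

// Hypothetical partitioned table matching the insertInto target above.
spark.sql("CREATE DATABASE IF NOT EXISTS db")
spark.sql("""
  CREATE TABLE IF NOT EXISTS db.partitionedTable (value STRING)
  PARTITIONED BY (dt STRING)
  STORED AS PARQUET
""")

// With the default hive.exec.dynamic.partition.mode=strict, this throws the
// SparkException above, because no static partition value is supplied.
val df = Seq(("a", "2019-10-30"), ("b", "2019-10-31")).toDF("value", "dt")
df.write.mode("overwrite").insertInto("db.partitionedTable")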

asked Oct 30 '19 by Peter Krauss

2 Answers

I had the same problem, and the only workaround I found was to set the config directly on the session before writing:

// Set the Hive property on the live session, then run the write from the question.
spark.conf.set("hive.exec.dynamic.partition.mode", "nonstrict")
df.write.mode("overwrite").insertInto("db.partitionedTable")
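
The reason the --conf attempt in the question fails is that spark-shell only accepts spark.*-prefixed keys there and warns-and-ignores everything else (that is the "Ignoring non-spark config property" message), while spark.conf.set writes to the session's runtime configuration, which is consulted when the Hive write actually executes.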
answered Nov 16 '22 by Paul


You can try using the spark.hadoop.* prefix, as suggested in the Custom Spark Configuration section of the docs for version 2.3. It might work in 2.2 as well, if it was just a doc bug :)

spark-shell \
  --conf "spark.hadoop.hive.exec.dynamic.partition=true" \
  --conf "spark.hadoop.hive.exec.dynamic.partition.mode=nonstrict" \
  ...
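
If you want to verify that the property landed, spark.hadoop.* entries are copied, with the prefix stripped, into the SparkContext's Hadoop configuration, which Spark's Hive support reads. A quick sanity check from the shell:

// Should return "nonstrict" if the --conf above was picked up.
spark.sparkContext.hadoopConfiguration.get("hive.exec.dynamic.partition.mode")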
answered Nov 16 '22 by mazaneicha