I am using this version of Spark: spark-1.4.0-bin-hadoop2.6. I want to check a few default properties, so I ran the following statement in spark-shell:
scala> sqlContext.getConf("spark.sql.hive.metastore.version")
I was expecting the call to getConf to return a value of 0.13.1, as described in this link. But I got the exception below:
java.util.NoSuchElementException: spark.sql.hive.metastore.version
at org.apache.spark.sql.SQLConf$$anonfun$getConf$1.apply(SQLConf.scala:283)
at org.apache.spark.sql.SQLConf$$anonfun$getConf$1.apply(SQLConf.scala:283)
Am I retrieving the properties in the right way?
You can use
sc.getConf.toDebugString
or
sqlContext.getAllConfs
which will return all values that have been set; however, some defaults are only defined in the code. In your specific example, it is indeed in the code:
getConf(HIVE_METASTORE_VERSION, hiveExecutionVersion)
where the default value is defined as:
val hiveExecutionVersion: String = "0.13.1"
So, getConf will attempt to pull the metastore version from the config, falling back to a default, but this value is not listed in the conf itself.
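If you want to avoid the NoSuchElementException for keys that are only defaulted in code, SQLContext.getConf also has a two-argument overload that takes a fallback. A minimal spark-shell sketch (the fallback string here is just illustrative, not the authoritative default):
// Lists only the properties that have actually been set in this session
sqlContext.getAllConfs.foreach { case (k, v) => println(s"$k = $v") }
// getConf(key) throws for unset keys; the two-argument form supplies a fallback
val metastoreVersion = sqlContext.getConf("spark.sql.hive.metastore.version", "0.13.1")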
In Spark 2.x.x, if I wanted to know the default value of a Spark conf, I would do this:
The command below returns a Scala Map in spark-shell.
spark.sqlContext.getAllConfs
To find the value for a conf property, e.g. the default warehouse dir used by Spark, set in the conf as spark.sql.warehouse.dir:
spark.sqlContext.getAllConfs.get("spark.sql.warehouse.dir")
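Since getAllConfs returns a plain Scala Map[String, String], the lookup above yields an Option[String]. A small sketch of handling a key that may not be set (the fallback value is illustrative only):
// .get returns Option[String]; getOrElse supplies a fallback for unset keys
val warehouseDir = spark.sqlContext.getAllConfs.getOrElse("spark.sql.warehouse.dir", "spark-warehouse")
// In Spark 2.x the runtime config can also be read directly
println(spark.conf.get("spark.sql.warehouse.dir"))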