How to get default property values in Spark

I am using this version of Spark: spark-1.4.0-bin-hadoop2.6. I want to check a few of the default properties, so I ran the following statement in spark-shell:

scala> sqlContext.getConf("spark.sql.hive.metastore.version")

I was expecting the call to getConf to return a value of 0.13.1, as described in this link. Instead, I got the exception below:

java.util.NoSuchElementException: spark.sql.hive.metastore.version
    at org.apache.spark.sql.SQLConf$$anonfun$getConf$1.apply(SQLConf.scala:283)
    at org.apache.spark.sql.SQLConf$$anonfun$getConf$1.apply(SQLConf.scala:283)

Am I retrieving the properties in the right way?

asked Jul 17 '15 by Raj

2 Answers

You can use

sc.getConf.toDebugString

OR

sqlContext.getAllConfs

which will return all of the values that have been explicitly set; some defaults, however, exist only in the code. In your specific example, the lookup falls back to a hard-coded value:

getConf(HIVE_METASTORE_VERSION, hiveExecutionVersion)

where that fallback is defined as:

val hiveExecutionVersion: String = "0.13.1"

So getConf pulls the metastore version from the configuration when it has been set and falls back to the hard-coded default otherwise; because that default is never written into the conf, it does not appear there, which is why the one-argument lookup throws NoSuchElementException.
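
A practical workaround is to supply the fallback yourself; a minimal sketch, assuming the two-argument getConf(key, defaultValue) overload that SQLContext exposes:

// returns "0.13.1" instead of throwing when the key has not been set
sqlContext.getConf("spark.sql.hive.metastore.version", "0.13.1")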

answered Oct 04 '22 by Justin Pihony


In Spark 2.x, if I want to know the default value of a Spark conf, I do this:

The command below returns a Scala Map in spark-shell:

spark.sqlContext.getAllConfs 

To find the value of a particular conf property, e.g. the default warehouse directory that Spark sets in the conf, spark.sql.warehouse.dir:

spark.sqlContext.getAllConfs.get("spark.sql.warehouse.dir")
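
Since getAllConfs returns a Map[String, String], this lookup yields an Option[String] and gives None rather than throwing when the key is absent. A minimal sketch handling the Option, alongside the Spark 2.x RuntimeConfig alternative (spark.conf.get), which I believe resolves the effective value including built-in defaults:

// unwrap the Option, substituting a placeholder when the key is absent
spark.sqlContext.getAllConfs.get("spark.sql.warehouse.dir").getOrElse("<not set>")

// alternative: the runtime conf API returns the resolved value directly
spark.conf.get("spark.sql.warehouse.dir")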
answered Oct 03 '22 by praveenak