The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------ (on Linux)

Hi, I was executing the following Spark code in Eclipse on CDH 5.8 and got the RuntimeException above:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.hive.HiveContext;

public static void main(String[] args) {
    final SparkConf sparkConf = new SparkConf().setMaster("local").setAppName("HiveConnector");
    final JavaSparkContext sparkContext = new JavaSparkContext(sparkConf);
    // HiveContext gives the SQLContext access to Hive tables
    SQLContext sqlContext = new HiveContext(sparkContext);

    DataFrame df = sqlContext.sql("SELECT * FROM test_hive_table1");
    //df.show();
    df.count();
}

According to the exception, /tmp/hive on HDFS should be writable. However, we are executing the Spark job in local mode, so it is actually the /tmp/hive directory on the local (Linux) file system, not HDFS, that lacks write permission.
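You can confirm this by inspecting the directory on the local file system (illustrative output; the owner and timestamp will differ on your machine):

$ ls -ld /tmp/hive
drwx------ 3 someuser someuser 4096 Dec 18 06:10 /tmp/hive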

So I executed the command below to grant permission:

$ sudo chmod -R 777 /tmp/hive

Now it is working for me.

If you get the same issue when executing the Spark job in cluster mode, you should configure the properties below in the hive-site.xml file in Hive's conf folder and restart the Hive server:

  <property>
    <name>hive.exec.scratchdir</name>
    <value>/tmp/hive</value>
    <description>Scratch space for Hive jobs</description>
  </property>
  <property>
    <name>hive.scratch.dir.permission</name>
    <value>777</value>
    <description>The permission for the user-specific scratch directories that get created in the root scratch directory</description>
  </property>
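After updating hive-site.xml, restart the Hive server so the change takes effect. On CDH the restart might look like this (the service name is an assumption and varies by distribution and version):

$ sudo service hive-server2 restart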
Prashant Sahoo asked Dec 18 '16

1 Answer

On Windows, use the proper 64-bit winutils and set the permissions:

winutils.exe chmod -R 777 \tmp\hive

Then point Hadoop at the winutils location and create the Spark session:

import java.io.File
import org.apache.commons.io.FileUtils
import org.apache.spark.sql.SparkSession

System.setProperty("hadoop.home.dir", "C:\\Users\\Hadoop_home") // dir containing bin\winutils.exe
lazy val spark: SparkSession = {
  // Delete stale metastore/warehouse data left over from previous runs
  FileUtils.deleteDirectory(new File("c:\\tmp\\metastore_db"))
  FileUtils.deleteDirectory(new File("c:\\tmp\\spark-warehouse"))
  SparkSession.builder().config("spark.sql.warehouse.dir", "C:\\temp\\")
    .master("local").appName("spark session for testing").enableHiveSupport().getOrCreate()
}
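With this session in place, the query from the question should run; a minimal usage sketch (the table name is taken from the question and assumed to exist):

val df = spark.sql("SELECT * FROM test_hive_table1")
println(df.count())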
donald answered Dec 03 '22