I modified this section of hive-default.xml.template, substituting my own path for the value:
<property>
<name>hive.metastore.warehouse.dir</name>
<value>/user/hive/warehouse</value>
<description>location of default database for the warehouse</description>
</property>
When I run Hive and try to create a table, it reports that it created the table under file://mypath/etc.. yet it is still looking for /user/hive/warehouse. Did I do something wrong? I also tried creating a hive-site.xml, but that does not seem to work either.
Configuring the Spark SQL Hive Warehouse Directory
On an Ambari cluster, select Spark2 > Configs, then use Add Property in "Custom spark2-defaults" and "Custom spark2-thrift-sparkconf" to add a spark.sql.warehouse.dir property with the value /apps/hive/warehouse.
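Outside Ambari, the same setting can go straight into spark-defaults.conf. A minimal sketch, assuming the stock HDP warehouse path (match it to your own hive.metastore.warehouse.dir):

```
# spark-defaults.conf — path is an assumption; use your warehouse location
spark.sql.warehouse.dir /apps/hive/warehouse
```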
Hive stores table files by default at /user/hive/warehouse on the HDFS file system. You need to create this directory on HDFS before you use Hive. Under this location you will find a directory for each database you create, with subdirectories named after each table.
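The directory-creation step above can be sketched as follows; the path is the stock default (override it for your cluster), and the hdfs CLI is assumed to be on PATH, so the script only prints a message when it is not:

```shell
#!/bin/sh
# Default Hive warehouse location; override via WAREHOUSE_DIR for your cluster.
WAREHOUSE_DIR=${WAREHOUSE_DIR:-/user/hive/warehouse}

if command -v hdfs >/dev/null 2>&1; then
  # Create the directory tree and make it group-writable, as Hive expects.
  hdfs dfs -mkdir -p "$WAREHOUSE_DIR"
  hdfs dfs -chmod g+w "$WAREHOUSE_DIR"
else
  echo "hdfs not found on PATH; run this on a cluster node"
fi
```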
The location for external Hive databases is "/warehouse/tablespace/external/hive/" and the location for managed databases is "/warehouse/tablespace/managed/hive". In older versions of Hive, the default storage location for databases was "/apps/hive/warehouse/".
Change the warehouse path in hive-site.xml as follows:
<property>
<name>hive.metastore.warehouse.dir</name>
<value>Your_Path_HERE</value>
<description>location of default database for the warehouse</description>
</property>
Grant permissions on the <Your_Path_HERE> directory if it is on the local file system:
sudo chown -R user <Your_Path_HERE>
sudo chmod -R 777 <Your_Path_HERE>
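To sanity-check the permission step locally, a throwaway directory can stand in for <Your_Path_HERE> (the mktemp path is purely illustrative):

```shell
#!/bin/sh
# Create a scratch directory standing in for the warehouse path.
WAREHOUSE=$(mktemp -d)

# Open it up the same way as above (777 is very permissive; fine for a local test).
chmod -R 777 "$WAREHOUSE"

# Show the resulting mode, e.g. drwxrwxrwx.
ls -ld "$WAREHOUSE"
```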
If the given path is on HDFS, stop and restart the Hadoop services:
stop-all.sh
start-all.sh
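After the restart, one way to confirm Hive picked up the new value (assuming the hive CLI is installed; the STATUS variable is just illustrative bookkeeping):

```shell
#!/bin/sh
# Print the effective warehouse directory from Hive's point of view.
if command -v hive >/dev/null 2>&1; then
  hive -e "set hive.metastore.warehouse.dir;"
  STATUS=checked
else
  echo "hive CLI not found; run this where Hive is installed"
  STATUS=skipped
fi
```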