hive> alter table my_table_name set location "hdfs://nameservice1/foo";
OK
Time taken: 0.173 seconds
hive> alter table my_table_name set location "hdfs://nameservice1/foo/bar";
Authorization failed:org.apache.hadoop.security.AccessControlException: action WRITE not permitted on path hdfs://nameservice1/foo for user hadoop_user. Use show grant to get more details.
As seen in the screen output above, the ALTER TABLE ... SET LOCATION statement works exactly once on the external table and then throws an error on every subsequent attempt. Please advise how I can get the ALTER TABLE ... SET LOCATION statement to keep working.
Step 1: Describe the database student to see its parent directory. By default, Hive stores its data at /user/hive/warehouse on HDFS.
DESCRIBE DATABASE EXTENDED student;
Step 2: Use ALTER to change the parent-directory location (NOTE: /hive_db is the available directory on my HDFS).
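The Step 2 statement itself isn't shown above; here is a minimal sketch, assuming Hive 2.2.1 / 2.4.0 or later (where ALTER DATABASE ... SET LOCATION is supported), the student database, the /hive_db directory mentioned above, and the nameservice1 namenode from the transcript at the top:

hive> ALTER DATABASE student SET LOCATION 'hdfs://nameservice1/hive_db';

Note that this only changes the default parent directory for tables created in the database afterwards; it does not move existing data or change the locations of existing tables.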
Long story short: the location of a Hive managed table is just metadata; if you update it, Hive will no longer find the table's data. You need to physically move the data on HDFS yourself.
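A minimal sketch of that two-step move, assuming a managed table my_table_name stored under the default warehouse directory and a target directory /foo/bar (paths are illustrative, borrowed from the transcript at the top):

$ hdfs dfs -mkdir -p /foo/bar                                   # create the target directory
$ hdfs dfs -mv /user/hive/warehouse/my_table_name/* /foo/bar/   # physically move the data files
hive> ALTER TABLE my_table_name SET LOCATION 'hdfs://nameservice1/foo/bar';

Running only the ALTER TABLE statement would leave the files behind under the old path, which is why Hive stops finding the data.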
Yes, you can change the default location of a managed table by using the LOCATION '<hdfs_path>' clause.
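For example (a sketch with hypothetical table and path names, not taken from the thread above), the clause can be supplied at creation time and changed later with SET LOCATION:

hive> CREATE TABLE my_managed_table (id INT, name STRING)
    > LOCATION 'hdfs://nameservice1/custom/path';
hive> ALTER TABLE my_managed_table SET LOCATION 'hdfs://nameservice1/other/path';

For a managed table, newly loaded data will land under the new location, but any existing files stay where they are, as the previous answer points out.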
I figured out the error and fixed it. The issue was that during the creation of the table I had set its location to a non-existent path on HDFS, so when I tried to alter its location, Hive would not allow me to do so.
The resolution: I first created the directory the table was currently pointing to, and then created the directory I wanted to point the table to. After that, the ALTER TABLE ... SET LOCATION statement worked as required.
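A minimal sketch of that fix, using the /foo and /foo/bar paths from the transcript at the top (and assuming hadoop_user has write access to them):

$ hdfs dfs -mkdir -p /foo       # directory the table currently points to
$ hdfs dfs -mkdir -p /foo/bar   # directory the table should point to
hive> ALTER TABLE my_table_name SET LOCATION 'hdfs://nameservice1/foo/bar';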