I tried to install Hive on a Raspberry Pi 2. I installed Hive by uncompressing the zipped Hive package and configuring $HADOOP_HOME and $HIVE_HOME manually under the hduser user/group I created. When running hive, I got the following error message:

hive
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
Exception in thread "main" java.lang.RuntimeException: Hive metastore database is not initialized. Please use schematool (e.g. ./schematool -initSchema -dbType ...) to create the schema. If needed, don't forget to include the option to auto-create the underlying database in your JDBC connection string (e.g. ?createDatabaseIfNotExist=true for mysql)
So I ran the command suggested in the error message above:

schematool -dbType derby -initSchema

and got this error message:
Error: FUNCTION 'NUCLEUS_ASCII' already exists. (state=X0Y68,code=30000)
org.apache.hadoop.hive.metastore.HiveMetaException: Schema initialization FAILED! Metastore state would be inconsistent !!
*** schemaTool failed ***
There doesn't seem to be any helpful information when I google this error. Any help, or any explanation of how Hive works with Derby, would be appreciated!
There are two different ways to set up the metastore server and metastore database, using different Hive configurations. The configuration options for the metastore database, where metadata is persisted, are: a local/embedded metastore database (Derby), or a remote metastore database.
What is the Hive Metastore? The metastore is the central repository of Apache Hive metadata. It stores metadata for Hive tables (such as their schema and location) and partitions in a relational database, and it provides clients access to this information via the metastore service API.
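For reference, here is a minimal sketch of what the local/embedded Derby setup looks like. The two property values below are Hive's stock defaults, shown only for illustration; the conf path assumes a standard tarball install under $HIVE_HOME:

# Write a minimal hive-site.xml for the embedded (local) Derby metastore.
# Both property values below are Hive's defaults for the Derby setup.
cat > $HIVE_HOME/conf/hive-site.xml <<'EOF'
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:derby:;databaseName=metastore_db;create=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>org.apache.derby.jdbc.EmbeddedDriver</value>
  </property>
</configuration>
EOF

Note the relative databaseName=metastore_db in the connection URL; it is the reason for the per-directory behavior described further down.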
After installing Hive, if the first thing you did was run hive, then hive attempted to create and initialize the metastore_db, but apparently didn't get it right. On that initial run, you may have seen your error:
Exception in thread "main" java.lang.RuntimeException: Hive metastore database is not initialized. Please use schematool (e.g. ./schematool -initSchema -dbType ...) to create the schema. If needed, don't forget to include the option to auto-create the underlying database in your JDBC connection string (e.g. ?createDatabaseIfNotExist=true for mysql)
Running hive, even though it fails, creates a metastore_db directory in the directory from which you ran hive:
ubuntu15-laptop: ~ $>ls -l | grep meta
drwxrwxr-x 5 testuser testuser 4096 Apr 14 12:44 metastore_db
So when you then tried running
ubuntu15-laptop: ~ $>schematool -initSchema -dbType derby
The metastore already existed, but not in a complete form, so the schema initialization failed.
Soooooo the answer is:
Before you run hive for the first time, run
schematool -initSchema -dbType derby
If you already ran hive and then tried to initSchema and it's failing:
mv metastore_db metastore_db.tmp
Re-run
schematool -initSchema -dbType derby
Run hive again
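Put together, the fix for the already-ran-hive case is the short sequence below. The schematool -info step is just an optional sanity check; run everything from the same directory in which the broken metastore_db was created:

# Move the half-initialized metastore out of the way (kept as a backup).
mv metastore_db metastore_db.tmp
# Create a fresh, fully initialized Derby metastore in this directory.
schematool -initSchema -dbType derby
# Optional sanity check: prints the metastore schema version.
schematool -info -dbType derby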
Also of note: if you change directories, the metastore_db created above won't be found! I'm sure there's a good reason for this that I don't know yet, because I'm literally trying to use Hive for the first time today. Ahhh, here's information on this: metastore_db created wherever I run Hive
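A common workaround for that per-directory behavior (a sketch, assuming you have write access to the path you pick; /home/hduser/metastore_db below is only a placeholder) is to give databaseName an absolute path in the javax.jdo.option.ConnectionURL property of hive-site.xml, so the same metastore_db is found no matter where you launch hive from:

jdbc:derby:;databaseName=/home/hduser/metastore_db;create=true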
I installed Hive with Homebrew (macOS) at /usr/local/Cellar/hive, and after running

schematool -dbType derby -initSchema

I got the following error message:
Starting metastore schema initialization to 2.0.0
Initialization script hive-schema-2.0.0.derby.sql
Error: FUNCTION 'NUCLEUS_ASCII' already exists. (state=X0Y68,code=30000)
org.apache.hadoop.hive.metastore.HiveMetaException: Schema initialization FAILED! Metastore state would be inconsistent !!
However, I couldn't find either a metastore_db or a metastore_db.tmp folder under the install path, so I tried:
find /usr/ -name hive-schema-2.0.0.derby.sql
vi /usr/local/Cellar/hive/2.0.1/libexec/scripts/metastore/upgrade/derby/hive-schema-2.0.0.derby.sql
schematool -dbType derby -initSchema
and then everything went well! Hope this helps someone.
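The answer above doesn't say what was edited in vi, but given the "already exists" error, the usual workaround is to comment out (prefix with --) the CREATE FUNCTION statements for NUCLEUS_ASCII and NUCLEUS_MATCHES in that schema script, since Derby refuses to create functions that are already present. You can locate the statements first (the path matches the Homebrew install above):

# Find the CREATE FUNCTION lines Derby complains about, with line numbers.
grep -nE 'NUCLEUS_ASCII|NUCLEUS_MATCHES' /usr/local/Cellar/hive/2.0.1/libexec/scripts/metastore/upgrade/derby/hive-schema-2.0.0.derby.sql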