While connecting Spark (1.4.0, built for Hadoop 2.6.0) to Hive (version 1.1.0) tables, an exception is thrown because of a Derby version mismatch.
Is there any way to make both (Spark and Hive) use the same Derby version? Is there any other way to fix this? (One way to check which Derby jars are in play is sketched after the stack trace below.)
Caused by: java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@e0d0b81, see the next exception for details.
at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
... 88 more
Caused by: java.sql.SQLException: Database at /home/saurabh/Softwares/hive1.1/bin/metastore_db has an incompatible format with the current version of the software. The database was created by or upgraded by version 10.11.
at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
... 85 more
Caused by: ERROR XSLAN: Database at /home/saurabh/Softwares/hive1.1/bin/metastore_db has an incompatible format with the current version of the software. The database was created by or upgraded by version 10.11.
at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
at org.apache.derby.impl.store.raw.log.LogToFile.readControlFile(Unknown Source)
at org.apache.derby.impl.store.raw.log.LogToFile.boot(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.bootLogFactory(Unknown Source)
at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.setRawStoreFactory(Unknown Source)
at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
... 85 more
15/07/01 17:29:49 INFO SparkUI: Stopped Spark web UI at http://192.168.17.155:4040
15/07/01 17:29:49 INFO DAGScheduler: Stopping DAGScheduler
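One quick way to confirm the mismatch is to list the Derby jars each installation ships. The commands below are only a sketch: set SPARK_HOME and HIVE_HOME to your install directories, and note that in Spark 1.x Derby may be bundled inside the spark-assembly jar rather than as a standalone derby-*.jar.
# Locate every Derby jar under each installation. The error above says the
# metastore was created/upgraded by Derby 10.11, so whichever side boots it
# with an older Derby is the one that fails.
find "$SPARK_HOME" -iname 'derby*.jar'
find "$HIVE_HOME"  -iname 'derby*.jar'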
This is how I solved it. From Git Bash, I deleted:
rm -rf "${APPDATA//\\//}/../Local/Temp/spark-"*
rm -rf /c/tmp/hive
rm -rf ./metastore_db
I am not sure this is exactly applicable to your case, but it reset the Derby database so that I could use it.
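On a Linux install like the one in the question, the equivalent cleanup might look like the sketch below; the temp paths are assumptions and depend on where spark-shell/hive was launched. Be aware that deleting metastore_db wipes the embedded metastore, i.e. any table metadata stored in it.
# Remove stale Spark/Hive scratch directories and the local Derby metastore so
# the metastore is recreated by whichever Derby version is now on the classpath.
rm -rf /tmp/spark-*        # Spark scratch directories (assumed default location)
rm -rf /tmp/hive           # Hive scratch directory (assumed default location)
rm -rf ./metastore_db      # embedded Derby metastore in the launch directory
rm -f  ./derby.log         # Derby log written next to the metastore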
My Spark was using derby-10.12.1.1.jar and Hive was using derby-10.10.2.0.jar, so I went to /usr/local/spark/jars/, copied derby-10.12.1.1.jar, and pasted it into /usr/local/hive/lib.
That step solved my problem.
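A minimal sketch of that fix, assuming the same /usr/local install paths as above (adjust to your layout); you will likely also want to move the older jar aside so only one Derby version ends up on Hive's classpath:
# Copy Spark's newer Derby jar into Hive's lib directory.
cp /usr/local/spark/jars/derby-10.12.1.1.jar /usr/local/hive/lib/
# Park the older Derby jar so Hive does not load both versions.
mv /usr/local/hive/lib/derby-10.10.2.0.jar /usr/local/hive/lib/derby-10.10.2.0.jar.bak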
This is due to a Derby version incompatibility between Spark and Hive. In my case, Spark had Derby 10.12 while Hive was using 10.14. I copied derby-10.14.1.0.jar from the hive/lib directory to the spark/jars directory and the issue was resolved.
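The same approach sketched for the opposite direction, with a quick check of which Derby jars each side currently ships (the /usr/local paths are assumptions):
# See which Derby versions the two distributions bundle.
ls /usr/local/spark/jars/derby-*.jar
ls /usr/local/hive/lib/derby-*.jar
# Copy Hive's newer Derby jar into Spark's jars directory so both resolve the same version.
cp /usr/local/hive/lib/derby-10.14.1.0.jar /usr/local/spark/jars/
# (Optionally move Spark's older derby-10.12*.jar out of jars/ so only one version remains.)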