It seems that when I connect to a Databricks SQL Warehouse, it uses the default catalog, which is hive_metastore.
Is there a way to make a Unity Catalog catalog the default?
I know I can run the query
USE CATALOG MAIN
and then the current session will use Unity Catalog.
But I am looking for a way to make that the default for every session.
So, if the first query after login is
CREATE SCHEMA IF NOT EXISTS MY_SCHEMA
the schema will be created inside the main catalog.
There are three ways to set the default catalog:
1. Workspace level: pass --default-catalog-name when assigning the metastore to the workspace with the Databricks CLI:

databricks unity-catalog metastores assign --workspace-id 1234567890123456 \
  --metastore-id 12a345b6-9999-9de3-3456-e789f0a12b34 \
  --default-catalog-name my_catalog
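If you prefer the REST API over the CLI, the call above maps onto the Unity Catalog metastore-assignment endpoint. The sketch below only builds the request URL and payload; the endpoint path and field names are assumptions based on that API, and the workspace URL is a placeholder:

```python
import json

# Hedged sketch: the CLI command above wraps the Unity Catalog
# metastore-assignment API. The endpoint path and field names are
# assumptions; the IDs are the placeholder values from the CLI example.
workspace_id = "1234567890123456"
url = (f"https://<workspace-url>/api/2.1/unity-catalog/"
       f"workspaces/{workspace_id}/metastore")
payload = {
    "metastore_id": "12a345b6-9999-9de3-3456-e789f0a12b34",
    "default_catalog_name": "my_catalog",
}
print(url)
print(json.dumps(payload))
```

You would send this payload with an authenticated HTTP client; it is shown here only to make the shape of the request explicit.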
2. Warehouse level: set the following configuration parameter (for SQL warehouses, this goes in the admin data access / Spark configuration):

spark.databricks.sql.initial.catalog.name my_catalog
3. Connection level: add ConnCatalog=my_catalog to the JDBC connection URL.
For ODBC, set Catalog=my_catalog in the DSN instead:

[Databricks]
Driver=<path-to-driver>
Host=<server-hostname>
Port=443
HTTPPath=<http-path>
ThriftTransport=2
SSL=1
AuthMech=3
UID=token
PWD=<personal-access-token>
Catalog=my_catalog
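If you generate DSN entries programmatically (for example, writing odbc.ini on Linux), the same entry can be produced with Python's standard configparser; this is just a convenience sketch of the DSN above, with the same placeholder values:

```python
import configparser
import io

# Sketch: emit the ODBC DSN entry shown above; Catalog=my_catalog is the
# key that pins new sessions to my_catalog. Values in <> are placeholders.
dsn = configparser.ConfigParser()
dsn.optionxform = str  # keep key capitalization (Driver, HTTPPath, ...)
dsn["Databricks"] = {
    "Driver": "<path-to-driver>",
    "Host": "<server-hostname>",
    "Port": "443",
    "HTTPPath": "<http-path>",
    "ThriftTransport": "2",
    "SSL": "1",
    "AuthMech": "3",
    "UID": "token",
    "PWD": "<personal-access-token>",
    "Catalog": "my_catalog",
}
buf = io.StringIO()
dsn.write(buf)
print(buf.getvalue())
```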