
Databricks change default catalog

It seems that when I connect to a Databricks warehouse, it uses the default catalog, which is hive_metastore. Is there a way to make a Unity Catalog catalog the default instead?

I know I can run the query

USE CATALOG MAIN

and then the current session will use the Unity Catalog catalog. But I am looking for a way to make it the default permanently, so that if the first query after login is

CREATE SCHEMA IF NOT EXISTS MY_SCHEMA

the schema will be created inside the main catalog.
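For context, the per-session workaround looks like this (assuming a Unity Catalog catalog named `main`; `current_catalog()` is a built-in Databricks SQL function for checking the active catalog):

```sql
-- Per-session workaround: must be re-run on every new connection
USE CATALOG main;

-- With the catalog set, unqualified names resolve inside main,
-- so this creates main.my_schema
CREATE SCHEMA IF NOT EXISTS my_schema;

-- Verify which catalog the session is currently using
SELECT current_catalog();
```

The goal of the question is to avoid having to issue that `USE CATALOG` statement at the start of every session.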

Gilo Avatar asked Oct 21 '25 03:10



1 Answer

Three ways to set the default catalog

Set the workspace default catalog:

% databricks unity-catalog metastores assign --workspace-id 1234567890123456 \
                                             --metastore-id 12a345b6-9999-9de3-3456-e789f0a12b34 \
                                             --default-catalog-name my_catalog

Set the cluster (job) default catalog:

spark.databricks.sql.initial.catalog.name my_catalog
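This line goes in the cluster's Spark config (Compute → your cluster → Advanced options → Spark). If you create clusters through the API or Terraform, the same setting lives under the cluster spec's `spark_conf` field; a minimal sketch, with `my_catalog` as a placeholder catalog name:

```json
{
  "spark_conf": {
    "spark.databricks.sql.initial.catalog.name": "my_catalog"
  }
}
```

New sessions on that cluster will then start with `my_catalog` as the current catalog instead of hive_metastore.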

Set BI Client default catalog:

JDBC:

Add ConnCatalog=my_catalog to the JDBC connection URL.
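A full JDBC URL with the catalog parameter might look like this. This is a sketch assuming the current Databricks JDBC driver, which uses the `jdbc:databricks://` prefix (older Simba builds use `jdbc:spark://`); hostname, HTTP path, and token are placeholders:

```
jdbc:databricks://<server-hostname>:443;httpPath=<http-path>;AuthMech=3;UID=token;PWD=<personal-access-token>;ConnCatalog=my_catalog
```

There is also a companion `ConnSchema` parameter if you want to pin the default schema as well.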

ODBC:

[Databricks]
Driver=<path-to-driver>
Host=<server-hostname>
Port=443
HTTPPath=<http-path>
ThriftTransport=2
SSL=1
AuthMech=3
UID=token
PWD=<personal-access-token>
Catalog=my_catalog
Douglas M Avatar answered Oct 25 '25 09:10

