
Spark SaveAsTable not allowed - Databricks throws UC not exist

Tags:

databricks

I am trying to saveAsTable a table on a job cluster without Unity Catalog (UC), and I am still hitting this UC error:

[UC_NOT_ENABLED] Unity Catalog is not enabled on this cluster error in Databricks cluster

My Databricks Runtime: 11.3

When I run the same command on an interactive cluster, I get no error. Below is the function that works on an interactive cluster but fails on an ephemeral (job) cluster:

from pyspark.sql import DataFrame

def _delta_write(write_mode: str,
                 container: str,
                 storage_account: str,
                 destination_path: str,
                 df: DataFrame,
                 partition_columns: list,
                 TableName: str):
    # saveAsTable returns None, so there is nothing useful to assign
    (df.write.format('delta')
        .option('path', f"abfss://{container}@{storage_account}.dfs.core.windows.net/{destination_path}/data")
        .mode(write_mode)
        .partitionBy(partition_columns)
        .saveAsTable(TableName))

I just want to save the table to hive_metastore, the traditional built-in Databricks metastore, from a job. I cannot enable UC because my account is not on the Premium tier.

Any tips?

asked Feb 13 '26 13:02 by Raphael Castilho Gil

1 Answer

Ran into the same issue. In short: almost certainly the problem is that TableName contains a three-level name (catalog.schema.table) where you intended a two-level name (schema.table). On a cluster without Unity Catalog, passing a three-level name to saveAsTable triggers the [UC_NOT_ENABLED] error.
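As a quick sanity check before calling saveAsTable, you could normalize the name. This is a minimal sketch in plain Python (the helper name and the catalog-stripping behavior are my own illustration, not a Databricks API):

    def to_two_level_name(table_name: str) -> str:
        """Drop any catalog prefix so only schema.table (or table) remains.

        Hypothetical helper: on clusters without Unity Catalog, saveAsTable
        should receive at most a two-level name; a three-level name raises
        [UC_NOT_ENABLED].
        """
        parts = table_name.split(".")
        if len(parts) == 3:
            # catalog.schema.table -> schema.table
            return ".".join(parts[1:])
        if len(parts) > 3:
            raise ValueError(f"Invalid table name: {table_name!r}")
        return table_name

For example, to_two_level_name("main.sales.orders") returns "sales.orders", while "sales.orders" and "orders" pass through unchanged.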

answered Feb 16 '26 00:02 by Kombajn zbożowy


