
Create External table in Azure databricks

I am new to Azure Databricks and am trying to create an external table pointing to an Azure Data Lake Storage (ADLS) Gen2 location.

From a Databricks notebook I have tried to set the Spark configuration for ADLS access, but I am still unable to execute the DDL I created.

Note: One solution that works for me is mounting the ADLS account to the cluster and then using the mount location in the external table's DDL, but I wanted to check whether it is possible to create the external table DDL with the ADLS path directly, without a mount location.
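
For reference, the mount-based variant that works for me looks roughly like this (the /mnt/adls mount point is just an illustrative name for wherever the container is mounted):

# Mount-based variant of the DDL shown later in this question,
# pointing at a DBFS mount path instead of the abfss:// URI
spark.sql("""
create external table test_mounted(
id string,
name string
)
partitioned by (pt_batch_id bigint, pt_file_id integer)
stored as parquet
location '/mnt/adls/dev/data/employee'
""")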

# Using Principal credentials
spark.conf.set("dfs.azure.account.auth.type", "OAuth")
spark.conf.set("dfs.azure.account.oauth.provider.type", "ClientCredential")
spark.conf.set("dfs.azure.account.oauth2.client.id", "client_id")
spark.conf.set("dfs.azure.account.oauth2.client.secret", "client_secret")
spark.conf.set("dfs.azure.account.oauth2.client.endpoint", 
"https://login.microsoftonline.com/tenant_id/oauth2/token")

DDL

create external table test(
id string,
name string
)
partitioned by (pt_batch_id bigint, pt_file_id integer)
STORED as parquet
location 'abfss://container@account_name.dfs.core.windows.net/dev/data/employee'

Error Received

Error in SQL statement: AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Got exception: shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.contracts.exceptions.ConfigurationPropertyNotFoundException Configuration property account_name.dfs.core.windows.net not found.);

Can you help me understand whether it is possible to refer to the ADLS location directly in the DDL?

Thanks.

asked Jun 27 '19 by anurag


1 Answer

You can perform this operation once access to the Azure Data Lake Storage is configured.

If you want all users in the Databricks workspace to have access to the mounted Azure Data Lake Storage Gen2 account, you should create a mount point as described below. The service principal that you use to access the Azure Data Lake Storage Gen2 account should be granted access only to that account; it should not be granted access to other resources in Azure.

Once a mount point is created through a cluster, users of that cluster can immediately access the mount point. To use the mount point in another running cluster, users must run dbutils.fs.refreshMounts() on that running cluster to make the newly created mount point available for use.
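
For illustration, a minimal sketch of such a mount using a service principal with OAuth 2.0 (the client id, client secret, tenant id and the /mnt/adls mount point are placeholders to replace with your own values):

# Mount the ADLS Gen2 container with a service principal (placeholders throughout)
configs = {
  "fs.azure.account.auth.type": "OAuth",
  "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
  "fs.azure.account.oauth2.client.id": "<client-id>",
  "fs.azure.account.oauth2.client.secret": "<client-secret>",
  "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<tenant-id>/oauth2/token"
}

dbutils.fs.mount(
  source = "abfss://container@account_name.dfs.core.windows.net/",
  mount_point = "/mnt/adls",
  extra_configs = configs
)

# On other clusters that are already running, make the new mount point visible
dbutils.fs.refreshMounts()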

There are three primary ways of accessing Azure Data Lake Storage Gen2 from a Databricks cluster:

  1. Mounting an Azure Data Lake Storage Gen2 filesystem to DBFS using a service principal with delegated permissions and OAuth 2.0.
  2. Using a service principal directly (see the sketch after this list).
  3. Using the Azure Data Lake Storage Gen2 storage account access key directly.
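
As a rough sketch of option 2 (accessing the path directly, without a mount), note that the OAuth settings have to be scoped to the full storage-account host name, which is what the "Configuration property account_name.dfs.core.windows.net not found" error is pointing at. The values below are placeholders, and for table DDL it is usually more reliable to set these keys in the cluster's Spark configuration rather than only in the notebook session:

# Account-scoped OAuth settings for direct abfss:// access (placeholders throughout)
spark.conf.set("fs.azure.account.auth.type.account_name.dfs.core.windows.net", "OAuth")
spark.conf.set("fs.azure.account.oauth.provider.type.account_name.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set("fs.azure.account.oauth2.client.id.account_name.dfs.core.windows.net", "<client-id>")
spark.conf.set("fs.azure.account.oauth2.client.secret.account_name.dfs.core.windows.net", "<client-secret>")
spark.conf.set("fs.azure.account.oauth2.client.endpoint.account_name.dfs.core.windows.net",
               "https://login.microsoftonline.com/<tenant-id>/oauth2/token")

With these in place, the CREATE EXTERNAL TABLE statement from the question, with its abfss:// location, can be retried as-is.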

For more details, refer to the "Azure Data Lake Storage Gen2" documentation.

Hope this helps.

answered Oct 01 '22 by CHEEKATLAPRADEEP-MSFT