Error: Invalid configuration value detected for fs.azure.account.key

I am using Azure Databricks to create a Delta table in Azure Blob Storage (ADLS Gen2), but on the last line I get the error "Failure to initialize configuration: Invalid configuration value detected for fs.azure.account.key".

%python
spark.conf.set(
    "fs.azure.account.oauth2.client.secret",
    "<storage-account-access-key>")
friends = spark.read.csv('myfile/fakefriends-header.csv',
   inferSchema = True, header = True)
friends.write.format("delta").mode('overwrite')\
   .save("abfss://[email protected]/myfile/friends_new")

How can I avoid this error?

Asked Jan 01 '26 by Nabia Salman

2 Answers

Short answer: you can't use a storage account access key to access data via the abfss protocol. If you want to use abfss, you need to provide the full OAuth configuration, as described in the documentation:

spark.conf.set(
  "fs.azure.account.auth.type.<storage-account-name>.dfs.core.windows.net", 
  "OAuth")
spark.conf.set(
  "fs.azure.account.oauth.provider.type.<storage-account-name>.dfs.core.windows.net", 
  "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(
  "fs.azure.account.oauth2.client.id.<storage-account-name>.dfs.core.windows.net", 
  "<application-id>")
spark.conf.set(
  "fs.azure.account.oauth2.client.secret.<storage-account-name>.dfs.core.windows.net", 
  dbutils.secrets.get(scope="<scope-name>",key="<service-credential-key-name>"))
spark.conf.set(
  "fs.azure.account.oauth2.client.endpoint.<storage-account-name>.dfs.core.windows.net", 
  "https://login.microsoftonline.com/<directory-id>/oauth2/token")
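Since all five settings share the same `<storage-account-name>.dfs.core.windows.net` suffix, it can help to build them in one place and apply them in a loop. This is just a convenience sketch, not part of any API; the account, client id, secret, and tenant values below are placeholders:

```python
def oauth_configs(account: str, client_id: str,
                  client_secret: str, tenant_id: str) -> dict:
    """Return the five per-account OAuth settings the answer sets one by one."""
    suffix = f"{account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
        f"fs.azure.account.oauth2.client.secret.{suffix}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

# On Databricks you would then apply them with spark.conf.set:
# for key, value in oauth_configs("mystorage", app_id, secret, tenant).items():
#     spark.conf.set(key, value)

configs = oauth_configs("mystorage", "<application-id>", "<secret>", "<directory-id>")
print(configs["fs.azure.account.auth.type.mystorage.dfs.core.windows.net"])  # prints OAuth
```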

A storage access key can only be used with the wasbs protocol, but wasbs is not recommended for ADLS Gen2.
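For comparison, the account-key style the question attempted maps onto the legacy wasbs (Blob) driver roughly as follows. This is a sketch only: the `<storage-account-name>`, `<container>`, and key values are placeholders, and again, wasbs is discouraged for ADLS Gen2:

```python
# Legacy Blob driver (wasbs): the account key is accepted directly.
spark.conf.set(
    "fs.azure.account.key.<storage-account-name>.blob.core.windows.net",
    "<storage-account-access-key>")

friends = spark.read.csv(
    "wasbs://<container>@<storage-account-name>.blob.core.windows.net/myfile/fakefriends-header.csv",
    inferSchema=True, header=True)
```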

P.S. You can also use a credential passthrough cluster if you have permissions to access that storage account.

Answered Jan 04 '26 by Alex Ott

A few months later, but try the following code in your notebook:

spark._jsc.hadoopConfiguration().set(
    "fs.azure.account.key.<account name>.dfs.core.windows.net",
    "<account key>")
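This works because `fs.azure.account.key.<account name>.dfs.core.windows.net` is the per-account Shared Key setting for the ABFS driver. Hard-coding the key in a notebook is risky, though; a common variant (a sketch, where the scope and key names are placeholders you would define yourself) pulls it from a Databricks secret scope instead:

```python
# Same Shared Key auth, but the key comes from a secret scope
# rather than being pasted into the notebook.
spark.conf.set(
    "fs.azure.account.key.<storage-account-name>.dfs.core.windows.net",
    dbutils.secrets.get(scope="<scope-name>", key="<storage-key-name>"))
```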
Answered Jan 04 '26 by CMonte2

