
Intermittent HTTP error when loading files from ADLS Gen2 in Azure Databricks

I am getting an intermittent HTTP error when I try to load files from ADLS Gen2 in Azure Databricks. The storage account has been mounted using a service principal associated with Databricks, and that service principal has been granted Storage Blob Data Contributor access through RBAC on the data lake storage account. A sample load statement is:

df = spark.read.format("orc").load("dbfs:/mnt/{storageaccount}/{filesystem}/{filename}")
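For context, an ADLS Gen2 mount backed by a service principal is typically created along these lines. This is a minimal sketch rather than the exact code from the post; the secret scope/key names and the angle-bracket placeholders are assumptions:

# OAuth configuration for the service principal (placeholders are illustrative)
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope="<scope-name>", key="<secret-key>"),
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

# Mount the ADLS Gen2 filesystem so it is reachable under /mnt/{storageaccount}
dbutils.fs.mount(
    source="abfss://{filesystem}@{storageaccount}.dfs.core.windows.net/",
    mount_point="/mnt/{storageaccount}",
    extra_configs=configs)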

The error message I get is:

Py4JJavaError: An error occurred while calling o214.load. : java.io.IOException: GET https://{storageaccount}.dfs.core.windows.net/{filesystem}/{filename}?timeout=90 StatusCode=412 StatusDescription=The condition specified using HTTP conditional header(s) is not met.
ErrorCode=ConditionNotMet ErrorMessage=The condition specified using HTTP conditional header(s) is not met.
RequestId:51fbfff7-d01f-002b-49aa-4c89d5000000
Time:2019-08-06T22:55:14.5585584Z

The error does not occur for all the files in the filesystem; most files load fine, and only some of them fail. I am not sure what the issue is here. Any help would be appreciated. To narrow it down, I list the files in the mount and record which ones fail to load, as in the sketch below.
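A minimal diagnostic sketch (not from the original post; the path placeholders mirror the ones above):

# Try each file in the mounted filesystem and collect the ones that raise errors
files = dbutils.fs.ls("dbfs:/mnt/{storageaccount}/{filesystem}/")

failed = []
for f in files:
    try:
        # Reading a single row is enough to trigger the HTTP GET against ADLS
        spark.read.format("orc").load(f.path).limit(1).collect()
    except Exception as e:
        failed.append((f.path, str(e)))

for path, err in failed:
    print(path, err)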

Asked Nov 06 '22 by Amit Sukralia
1 Answer

This has been resolved now. The underlying issue was due to a change at Microsoft's end. This is the RCA I received from Microsoft Support:

There was a storage configuration that was turned on incorrectly during the latest storage tenant upgrade. This type of error would only show up for namespace-enabled accounts on the latest upgraded tenant. The mitigation for this issue was to turn off the configuration on the specific tenant, and we have kicked off the configuration rollout for all the tenants. We have since added additional storage upgrade validation for ADLS Gen 2 to help cover this type of scenario.

Answered Nov 28 '22 by Amit Sukralia