I would like to import a csv file (sqlserver-dba-csv.txt) into a table on a SQL Server database hosted in Azure.
This file is located on a Azure File Service (location address: 'https://XXXXXXXXXXX.file.core.windows.net/XXXXXXXXXXX/sqlserver-dba-csv.txt') which is also a mapped drive on my local machine.
Eventually I would like this to be somewhat automated by a trigger, but for now I just want to import the data into a table on the SQL server to prove the process works.
The contents of the sqlserver-dba-csv.txt file are:
1,James Brown,blue
2,Prince,red
3,Rick James,yellow
The code I am running in SSMS is:
--create a table
CREATE TABLE musicians_csv (
musician_id INT,
full_name VARCHAR(50),
colour VARCHAR(20)
)
GO
--bulk insert csv into a SQL Server table
BULK
INSERT musicians_csv
FROM 'https://XXXXXXXXXXX.file.core.windows.net/XXXXXXXXXXX/sqlserver-dba-csv.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
--Verify data inserted
SELECT *
FROM musicians_csv
GO
--Drop the table
DROP TABLE musicians_csv
GO
The error message I receive is:
Msg 4861, Level 16, State 1, Line 10 Cannot bulk load because the file "https:/xxxxxxxxx.file.core.windows.net/xxxxxxxxx/sqlserver-dba-csv.txt" could not be opened. Operating system error code (null).
(0 row(s) affected)
I suspect the formatting of the file location is incorrect, but I have not been able to find a solution after searching.
Alternatively - am I able to reference and import a file on my local machine, even though my SQL Server is in the Azure cloud - for example a location like: "C:\Users\user.name\Desktop\sqlserver-dba-csv.txt"?
Any help greatly appreciated
You may be facing a security problem: Azure SQL may not have access to the file you're trying to import if the file has not been made available for public access.
Of course, you may not want the file to be available to everyone, so you have to store the credential used to access it in Azure SQL via the CREATE DATABASE SCOPED CREDENTIAL command:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'A-$tr0ng|PaSSw0Rd!';
GO
CREATE DATABASE SCOPED CREDENTIAL [CSV-Storage-Credentials]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<shared-access-signature for "csvimportdemo" blob storage here>'; -- the SAS token, without the leading '?'
GO
CREATE EXTERNAL DATA SOURCE [CSV-Storage]
WITH
(
TYPE = BLOB_STORAGE,
LOCATION = 'https://csvimportdemo.blob.core.windows.net',
CREDENTIAL= [CSV-Storage-Credentials]
);
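Optionally, as a quick sanity check (not part of the original sample), you can confirm the credential and the external data source were created by querying the standard catalog views; the names returned should match the objects created above:
--verify the credential and external data source exist
SELECT name FROM sys.database_scoped_credentials;
SELECT name, location, type_desc FROM sys.external_data_sources;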
Once this is done, you can access the file via the OPENROWSET (BULK ...) command:
SELECT
    FirstName,
    LastName,
    TwitterHandle
FROM OPENROWSET(
    BULK '<your file here>',
    DATA_SOURCE = 'CSV-Storage',
    FIRSTROW = 2,
    FORMATFILE = '<your format file here>',
    FORMATFILE_DATA_SOURCE = 'CSV-Storage'
) AS t;
You can find a full working sample on GitHub (from which the sample code I posted is taken) that also automates the import using an Azure Function:
https://github.com/yorek/AzureFunctionUploadToSQL/blob/master/SQL/create-objects.sql
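If you prefer to keep the BULK INSERT syntax from your question, the same external data source can be referenced through the DATA_SOURCE option. This is a minimal sketch assuming sqlserver-dba-csv.txt has been uploaded to a blob container in the storage account behind CSV-Storage (on Azure SQL, BULK INSERT reads from Blob Storage through an external data source, not from an Azure Files share or a local path); the FROM path is relative to the LOCATION of the data source, so the container name below is a placeholder:
--bulk insert using the external data source created above
BULK INSERT musicians_csv
FROM '<your container>/sqlserver-dba-csv.txt'
WITH
(
    DATA_SOURCE = 'CSV-Storage',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
GO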