 

Import CSV file into SQL Server



SQL Server based CSV Import

1) The CSV file data may have a , (comma) inside a field (e.g. a description), so how can I make the import handle this data?

Solution

If you're using , (comma) as a delimiter, then there is no way to differentiate between a comma as a field terminator and a comma in your data. I would use a different FIELDTERMINATOR, such as ||. The code would look like the sketch below, and this handles commas and single quotes in the data perfectly.
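
A minimal sketch, reusing the SchoolsTemp table and Schools.csv path from the snippets further down, and assuming the file was actually exported with || as its delimiter:

BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = '||', --custom delimiter; assumes the file was exported with ||, so commas inside the data are preserved
    ROWTERMINATOR = '\n',   --row terminator (newline)
    TABLOCK
)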

2) If the client creates the CSV from Excel, then any data that contains a comma is enclosed within " ... " (double quotes) [as in the example below], so how can the import handle this?

Solution

If you're using BULK INSERT, there is no direct way to handle the double quotes; the data will be inserted into the rows with the double quotes included. After inserting the data into the table, you can replace those double quotes with ''.

UPDATE SchoolsTemp
SET columnhavingdoublequotes = REPLACE(columnhavingdoublequotes, '"', '')

3) How do we track rows that have bad data and are skipped by the import? (Does the import skip rows that are not importable?)

Solution

Rows that aren't loaded into the table because of invalid data or format can be handled using the ERRORFILE property. Specify an error file name, and the rows that fail will be written to that error file. The code should look like this:

BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',  --CSV field delimiter
    ROWTERMINATOR = '\n',   --row terminator (newline)
    ERRORFILE = 'C:\CSVDATA\SchoolsErrorRows.csv',
    TABLOCK
)

From "How to import a CSV file into a database using SQL Server Management Studio" (2013-11-05):

First create a table in your database into which you will be importing the CSV file (a minimal example of such a table is sketched after the steps below). After the table is created:

  • Log into your database using SQL Server Management Studio

  • Right click on your database and select Tasks -> Import Data...

  • Click the Next > button

  • For the Data Source, select Flat File Source. Then use the Browse button to select the CSV file. Spend some time configuring how you want the data to be imported before clicking on the Next > button.

  • For the Destination, select the correct database provider (e.g. for SQL Server 2012, you can use SQL Server Native Client 11.0). Enter the Server name; Check Use SQL Server Authentication, enter the User name, Password, and Database before clicking on the Next > button.

  • On the Select Source Tables and Views window, you can Edit Mappings before clicking on the Next > button.

  • Check the Run immediately check box and click on the Next > button.

  • Click on the Finish button to run the package.
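
For reference, the destination table from the first step might look something like this (a sketch only; the table name and columns here are hypothetical and must be adjusted to match your actual CSV layout):

CREATE TABLE dbo.SchoolsTemp    --hypothetical staging table; adjust columns and types to your CSV
(
    SchoolName nvarchar(200),
    City       nvarchar(100),
    Enrollment int
)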


2) If the client creates the CSV from Excel, then any data that contains a comma is enclosed within " ... " (double quotes) [as in the example below], so how can the import handle this?

You should use the FORMAT = 'CSV' and FIELDQUOTE = '"' options (available in SQL Server 2017 and later):

BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH
(
    FORMAT = 'CSV', 
    FIELDQUOTE = '"',
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',  --CSV field delimiter
    ROWTERMINATOR = '\n',   --row terminator (newline)
    TABLOCK
)

The best, quickest, and easiest way to resolve the comma-in-data issue is to change Windows' list separator setting to something other than a comma (such as a pipe) and then save the file from Excel as a comma separated file. Excel will then generate a pipe-separated (or whatever separator you chose) file that you can import without ambiguity. The list separator can be changed in the Region settings of the Windows Control Panel.
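
If you go that route, the import side is the same BULK INSERT pattern shown above, just with the pipe as the field terminator (a sketch, assuming the same SchoolsTemp table and file path as the earlier examples):

BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = '|',  --the list separator Excel used when saving the file
    ROWTERMINATOR = '\n',   --row terminator (newline)
    TABLOCK
)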