 

How to import a very large csv file into an existing SQL Server table?

I have a very large csv file with ~500 columns and ~350k rows that I am trying to import into an existing SQL Server table.

I have tried BULK INSERT; I get "Query executed successfully, 0 rows affected." Interestingly, BULK INSERT worked in a matter of seconds for a similar operation on a much smaller csv file (fewer than 50 columns, ~77k rows).

I have also tried bcp; I get "Unexpected EOF encountered in BCP data-file. BCP copy in failed."

The task seems simple, yet it has been frustrating to no end. Any ideas or suggestions? Are there any other tools or utilities that you have successfully used for a bulk import operation or something similar? Thanks.

-- BULK INSERT

USE myDb
BULK INSERT myTable
FROM 'C:\Users\myFile.csv'
WITH
(
    FIRSTROW = 2,
    -- DATAFILETYPE = 'char',
    -- MAXERRORS = 100,
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
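
One thing I haven't ruled out yet is a line-ending mismatch: as far as I can tell, BULK INSERT treats '\n' as CRLF, so if the file uses bare LF the terminator would never match. A variant I still plan to try (same table and path as above, only the terminator given as a hex value):

-- BULK INSERT, assuming the file uses bare LF line endings
USE myDb
BULK INSERT myTable
FROM 'C:\Users\myFile.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a'  -- hex code for a bare line feed
);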

-- bcp

bcp myDb.dbo.myTable in 'C:\Users\myFile.csv' -T -t, -c
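
If I go back to bcp, I may also generate a format file and capture rejected rows to an error file, to see exactly where it hits the unexpected EOF. A rough sketch (myTable.fmt and bcp_errors.log are just placeholder names I'd use):

:: create a character-mode format file from the existing table definition
bcp myDb.dbo.myTable format nul -c -t, -f myTable.fmt -T

:: load the csv with the format file, skip the header row, log rejected rows
bcp myDb.dbo.myTable in "C:\Users\myFile.csv" -f myTable.fmt -F 2 -e bcp_errors.log -T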

UPDATE
I have now changed course. I've decided to join the csv files, which was my goal to begin with, outside of SQL Server, so that I don't have to load the data into a table for now. Still, it will be interesting to try to load just one record (~490 columns) from the csv file that otherwise failed, via BULK INSERT or bcp, and see whether that works.
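
A sketch of that single-record test, using FIRSTROW and LASTROW so I don't have to edit the file (these are standard BULK INSERT options; MAXERRORS = 0 just makes any failure surface immediately):

-- test-load only the first data record (row 2 of the file)
USE myDb
BULK INSERT myTable
FROM 'C:\Users\myFile.csv'
WITH
(
    FIRSTROW = 2,
    LASTROW = 2,
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    MAXERRORS = 0
);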

Micky W. asked Nov 30 '11


1 Answer

Check your file for an EOF character where it shouldn't be - BCP is telling you there is a problem with the file.

Notepad++ may be able to load the file so you can view and search it.
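
If the file is too big to open comfortably, a quick check from PowerShell may help (a sketch; the path is taken from the question, and 0x1A is the Ctrl-Z byte that is the usual culprit for a stray end-of-file marker):

# flag any line containing a stray Ctrl-Z (0x1A) character
Select-String -Path 'C:\Users\myFile.csv' -Pattern '\x1A' | Select-Object LineNumber, Line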

Jimbo answered Nov 03 '22