How do I solve a BCP utility error when importing a CSV?

My input files are on my local C:\ drive, and I use trusted Windows authentication to connect to SQL Server.

Most bcp discussions are quite old, so I have a feeling it has been replaced by something else (?). Other posts are mostly about exporting.

My final goal is to copy 80 CSV files into 80 tables. I don't want to do it with SSIS because all my tables are different, and I'd rather not create individual components for each one.

For the final part, though, I plan to use SSIS with an Execute Process Task.
I'm running bcp from the command prompt; is it possible to run it from SSMS?

This is my input .csv, comma-delimited:

c1,c2,c3,cAction,a1,a2,a3
8,9,2,Create Service Placements,2,2,3.0
9,10,2,Create Program Placements,3,2,4.0

This is my command line:

C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\170\Tools\Binn>bcp "Txbit.dbo.clear_roles" in C:\Import\BKUPDaily\clear_roles200.csv -T -S txMSServer -c  

And this is the error; I don't see any option to specify a driver.

C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\170\Tools\Binn>bcp "Txbit.dbo.clear_roles" in C:\Import\BKUPDaily\clear_roles200.csv -T -S txMSServer -c
Starting copy...
SQLState = S1000, NativeError = 0
Error = [Microsoft][ODBC Driver 17 for SQL Server]Unexpected EOF encountered in BCP data-file
0 rows copied.
Network packet size (bytes): 4096
Clock Time (ms.) Total : 1
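
A sketch of the same command with an explicit comma field terminator and the header row skipped, which is what I suspect a comma-delimited file needs (the -t, -r and -F values here are assumptions I haven't verified):

C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\170\Tools\Binn>bcp "Txbit.dbo.clear_roles" in C:\Import\BKUPDaily\clear_roles200.csv -T -S txMSServer -c -t"," -r"\n" -F 2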

And these are the drivers on my local machine: (screenshot of the installed ODBC drivers)

Asked Jan 23 '26 by Mich28


1 Answer

Just throwing dbatools into the ring. It has a command for importing CSVs: Import-DbaCsv. Importing your sample CSV data into a table is actually fairly trivial with it:

A sample table:

CREATE TABLE dbo.clear_roles (c1 int,
                              c2 int,
                              c3 int,
                              cAction nvarchar(100),
                              a1 int,
                              a2 int,
                              a3 int);
GO

And then the command looks like this:

Import-DbaCsv -Path C:\temp\clear_roles200.csv -SqlInstance 'srv-sql2022-dev\SANDBOX' -Database Sandbox -Table 'clear_roles'
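
(If dbatools isn't installed yet, it can be pulled from the PowerShell Gallery first; assuming PowerShellGet is available, something like this should do it:)

Install-Module dbatools -Scope CurrentUser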

If your goal is to loop through the files and insert the data into the corresponding tables, then, assuming the name of each table is the same as its file name (minus the last 7 characters), you could likely do something like this:

# All the CSV files in the import folder
$CsvFiles = Get-ChildItem -Path 'C:\temp\79680159' | Where-Object {$_.Extension -eq '.csv'}
$SqlInstance = 'srv-sql2022-dev\SANDBOX'
$Database = 'Sandbox'

foreach ($CsvFile in $CsvFiles){
    # Strip the last 7 characters (e.g. "200.csv") from the file name to get the table name
    $Table = $CsvFile.Name.Substring(0,$CsvFile.Name.Length-7)
    Import-DbaCsv -Path $CsvFile.FullName -SqlInstance $SqlInstance -Database $Database -Table $Table
}

If the name of the table isn't the name of the file minus the last 7 characters, you'll need to alter the above script appropriately to extract the table's name.
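
For example (just a sketch, assuming the file names end in a numeric suffix of varying length rather than always exactly 7 characters), you could strip trailing digits from the base name with a regex instead:

$Table = $CsvFile.BaseName -replace '\d+$',''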

With the sample data you've given, and another file (I've named vague_users100.csv):

v1,v2,vName,u1
0,9,Smith,12.2
,17,Brown,1243.45

And a table defined as:

CREATE TABLE dbo.vague_users (v1 int,
                              v2 int,
                              vName sysname,
                              u1 decimal(12,2));

This populates both tables with the expected data, and outputs the following:

ComputerName  : SRV-SQL2022-DEV
InstanceName  : SANDBOX
SqlInstance   : SRV-SQL2022-DEV\SANDBOX
Database      : Sandbox
Table         : clear_roles
Schema        : dbo
RowsCopied    : 2
Elapsed       : 170.51 ms
RowsPerSecond : 11.8
Path          : C:\temp\79680159\clear_roles200.csv

ComputerName  : SRV-SQL2022-DEV
InstanceName  : SANDBOX
SqlInstance   : SRV-SQL2022-DEV\SANDBOX
Database      : Sandbox
Table         : vague_users
Schema        : dbo
RowsCopied    : 2
Elapsed       : 112.2 ms
RowsPerSecond : 18
Path          : C:\temp\79680159\vague_users100.csv
Answered Jan 25 '26 by Larnu


