I am using C# with SqlBulkCopy, and I have a problem: I need to do a mass insert into one table, then another mass insert into another table.
These two tables have a PK/FK relationship.
Table A: Field1 - PK, auto-incrementing (easy to do with SqlBulkCopy, as it is straightforward).
Table B: Field1 - PK/FK - this field makes the relationship and is also the PK of this table. It is not auto-incrementing and needs to have the same id as the corresponding row in Table A.
So these tables have a one-to-one relationship, but I am unsure how to get back all the PK ids that the mass insert generated, since I need them for Table B.
Edit
Could I do something like this?
SELECT *
FROM Product
WHERE NOT EXISTS (SELECT *
                  FROM ProductReview
                  WHERE Product.ProductId = ProductReview.ProductId
                    AND Product.Qty IS NULL
                    AND Product.ProductName != 'Ipad')
This should find all the rows that were just inserted with the SqlBulkCopy. I am not sure how to take the results from this and then do a mass insert with them from a SP.
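Something like this is what I have in mind (just a sketch; the ProductReview column list and the extra Qty/ProductName conditions are placeholders, and the statement could live inside the SP instead):

// using System.Data.SqlClient;

static void PushNewProductsIntoReviews(string connectionString)
{
    // Sketch only: insert a ProductReview row for every Product row that
    // does not have one yet. Real column list / extra conditions omitted.
    const string pushSql = @"
        INSERT INTO ProductReview (ProductId)
        SELECT p.ProductId
        FROM Product p
        WHERE NOT EXISTS (SELECT 1
                          FROM ProductReview r
                          WHERE r.ProductId = p.ProductId)";

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(pushSql, connection))
    {
        connection.Open();
        command.ExecuteNonQuery();
    }
}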
The only problem I can see with this approach is that if a user is inserting records one at a time and this statement runs at the same time, it could try to insert a row twice into the ProductReview table.
So say one user is using the manual way and another user is doing the mass way at about the same time.
Manual way:
1. User submits data.
2. A LINQ to SQL Product object is made, filled with the data, and submitted.
3. This object now contains the ProductId.
4. Another LINQ to SQL object is made for the ProductReview table and is inserted (the ProductId from step 3 is sent along).
Mass way:
1. User grabs data from a user sharing the data.
2. All Product rows from the sharing user are grabbed.
3. A SqlBulkCopy insert on the Product rows happens.
4. My SP selects all rows that only exist in the Product table and meet some other conditions.
5. A mass insert happens with those rows.
So what happens if step 3 (manual way) is happening at the same time as step 4 (mass way)? I think it would try to insert the same row twice, causing a primary key constraint exception.
The SqlBulkCopy class can be used to write data only to SQL Server tables. However, the data source is not limited to SQL Server; any data source can be used, as long as the data can be loaded into a DataTable instance or read with an IDataReader instance.
By default, a bulk copy operation is its own transaction. When you want to perform a dedicated bulk copy operation, create a new instance of SqlBulkCopy with a connection string, or use an existing SqlConnection object without an active transaction.
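A minimal sketch of that basic pattern (the table and column names here are placeholders, not part of the question's schema):

// using System.Data;
// using System.Data.SqlClient;

static void BulkInsertProducts(string connectionString, DataTable products)
{
    // Built from a connection string, so the bulk copy runs as its own
    // transaction; the other constructors accept an existing SqlConnection
    // (and optionally a SqlTransaction) instead.
    using (var bulkCopy = new SqlBulkCopy(connectionString))
    {
        bulkCopy.DestinationTableName = "Product";                  // placeholder table name
        bulkCopy.ColumnMappings.Add("ProductName", "ProductName");  // placeholder column
        bulkCopy.WriteToServer(products);  // WriteToServer also has an IDataReader overload
    }
}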
The general pattern is: using SqlBulkCopy, upload the DataTable's data to a temporary table, then execute a SQL command to update the main table's data from the temporary table, and finally drop the temporary table.
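A rough outline of that temp-table flow (the #ProductStaging table, the Product columns, and the UPDATE join are all assumptions about the schema):

// using System.Data;
// using System.Data.SqlClient;

static void BulkUpdateViaTempTable(string connectionString, DataTable changes)
{
    using (var connection = new SqlConnection(connectionString))
    {
        // Keep one connection open so the #temp table survives all three steps.
        connection.Open();

        // 1. Create the temporary table (shape matches the incoming data).
        using (var create = new SqlCommand(
            "CREATE TABLE #ProductStaging (ProductId int, Qty int)", connection))
        {
            create.ExecuteNonQuery();
        }

        // 2. Bulk copy the DataTable into the temporary table.
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "#ProductStaging";
            bulkCopy.ColumnMappings.Add("ProductId", "ProductId");
            bulkCopy.ColumnMappings.Add("Qty", "Qty");
            bulkCopy.WriteToServer(changes);
        }

        // 3. Update the main table from the temporary table, then drop it.
        using (var update = new SqlCommand(
            @"UPDATE p SET p.Qty = s.Qty
              FROM Product p JOIN #ProductStaging s ON s.ProductId = p.ProductId;
              DROP TABLE #ProductStaging;", connection))
        {
            update.ExecuteNonQuery();
        }
    }
}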
In that scenario, I would use SqlBulkCopy to insert into a staging table (i.e. one that looks like the data I want to import, but isn't part of the main transactional tables), and then at the DB use an INSERT/SELECT to move the data into the first real table.
Now I have two choices depending on the server version; I could do a second INSERT/SELECT to the second real table, or I could use an INSERT with the OUTPUT clause to do the second insert, using the identity rows from the first table.
For example:
-- dummy schema
CREATE TABLE TMP (data varchar(max))
CREATE TABLE [Table1] (id int not null identity(1,1), data varchar(max))
CREATE TABLE [Table2] (id int not null identity(1,1), id1 int not null, data varchar(max))

-- imagine this is the SqlBulkCopy
INSERT TMP VALUES('abc')
INSERT TMP VALUES('def')
INSERT TMP VALUES('ghi')

-- now push into the real tables
INSERT [Table1]
OUTPUT INSERTED.id, INSERTED.data INTO [Table2](id1, data)
SELECT data FROM TMP
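For what it's worth, the C# side of that could be wired up something like this (the connection handling and column mapping are assumptions, not part of the answer above):

// using System.Data;
// using System.Data.SqlClient;

static void ImportViaStaging(string connectionString, DataTable rows)
{
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();

        // This is the "imagine this is the SqlBulkCopy" step: load the
        // staging table TMP from the DataTable.
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "TMP";
            bulkCopy.ColumnMappings.Add("data", "data");
            bulkCopy.WriteToServer(rows);
        }

        // Then push from TMP into the real tables; the OUTPUT clause writes
        // the generated Table1 identities straight into Table2.
        using (var push = new SqlCommand(
            @"INSERT [Table1]
              OUTPUT INSERTED.id, INSERTED.data INTO [Table2](id1, data)
              SELECT data FROM TMP", connection))
        {
            push.ExecuteNonQuery();
        }
    }
}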
If your app allows it, you could add another column in which you store an identifier of the bulk insert (a GUID, for example). You would set this id explicitly.
Then after the bulk insert, you just select the rows that have that identifier.
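A sketch of that idea, assuming a BatchId uniqueidentifier column has been added to the Product table (the table and column names are made up):

// using System;
// using System.Data;
// using System.Data.SqlClient;

static DataTable BulkInsertAndReadBack(string connectionString, DataTable products)
{
    // Stamp every row with the same batch identifier before the bulk copy.
    var batchId = Guid.NewGuid();
    if (!products.Columns.Contains("BatchId"))
        products.Columns.Add("BatchId", typeof(Guid));
    foreach (DataRow row in products.Rows)
        row["BatchId"] = batchId;

    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();

        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "Product";                  // placeholder table
            bulkCopy.ColumnMappings.Add("ProductName", "ProductName");  // placeholder column
            bulkCopy.ColumnMappings.Add("BatchId", "BatchId");          // assumed extra column
            bulkCopy.WriteToServer(products);
        }

        // Read back the identity values that this particular batch generated.
        using (var select = new SqlCommand(
            "SELECT ProductId, ProductName FROM Product WHERE BatchId = @batchId", connection))
        {
            select.Parameters.AddWithValue("@batchId", batchId);
            var result = new DataTable();
            new SqlDataAdapter(select).Fill(result);
            return result;
        }
    }
}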