How to copy a huge table data into another table in SQL Server

I have a table with 3.4 million rows, and I want to copy all of this data into another table.

I am performing this task with the query below:

select * 
into new_items 
from productDB.dbo.items

I need to know the best possible way to do this task.

asked Mar 14 '11 by sqlchild


People also ask

How do I copy an entire table into another table?

The INSERT INTO SELECT statement copies data from one table and inserts it into another. It requires that the data types in the source and target tables match. Note: the existing records in the target table are unaffected.
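
For example, a minimal sketch using the tables from the question (the column names id, name, and price are placeholders):

-- Assumes new_items already exists with a schema compatible with items;
-- existing rows in new_items are left untouched.
INSERT INTO new_items (id, name, price)
SELECT id, name, price
FROM productDB.dbo.items;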


4 Answers

I had the same problem, except my table has 2 billion rows, so the log file would grow without end if I did this, even with the recovery model set to Bulk-Logged:

insert into newtable select * from oldtable

So I operate on blocks of data. That way, if the transfer is interrupted, you can just restart it, and you don't need a log file as big as the table. You also seem to get less tempdb I/O, though I'm not sure why.

-- Allow explicit values in the target's identity column.
set identity_insert newtable on

DECLARE @StartID bigint, @LastID bigint, @EndID bigint

-- Resume from wherever the last (possibly interrupted) run stopped.
select @StartID = isNull(max(id), 0) + 1
from newtable

select @LastID = max(id)
from oldtable

-- <= so the final block, which may end exactly at @LastID, is not skipped.
while @StartID <= @LastID
begin
    -- Copy one block of about a million rows per statement.
    set @EndID = @StartID + 1000000

    insert into newtable (FIELDS,GO,HERE)
    select FIELDS,GO,HERE
    from oldtable with (NOLOCK)
    where id between @StartID and @EndID

    set @StartID = @EndID + 1
end

set identity_insert newtable off
go

You might need to change how you deal with IDs; this works best if your table is clustered by ID.
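
If the table lacks a dense, sequential ID to step through, a minimal alternative sketch (the table and column names here are placeholders) is to copy batches in key order and let each batch resume from the highest key already present in the target; wrap it in SET IDENTITY_INSERT as above if the target has an identity column:

DECLARE @BatchSize int = 1000000, @RowsCopied int = 1

while @RowsCopied > 0
begin
    -- Take the next batch of rows above everything already copied,
    -- so an interrupted run restarts cleanly with no bookkeeping.
    insert into newtable (id, col1, col2)
    select top (@BatchSize) id, col1, col2
    from oldtable with (NOLOCK)
    where id > isNull((select max(id) from newtable), 0)
    order by id

    set @RowsCopied = @@ROWCOUNT
end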

answered by Mathieu Longtin


If you are copying into a new table, the quickest way is probably what you have in your question, unless your rows are very large.

If your rows are very large, you may want to use the bulk insert functions in SQL Server. I think you can call them from C#.

Alternatively, you can first export the data to a text file and then bulk-copy (bcp) it in. This has the added benefit of letting you ignore keys, indexes, etc.
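
A rough sketch of that round trip, using the tables from the question and assuming the target table already exists (the server name and file path are placeholders, and -T assumes a trusted connection):

-- 1. Export the source table in native format from the command line:
--      bcp productDB.dbo.items out C:\temp\items.dat -n -S myserver -T
--
-- 2. Load the file into the target table; TABLOCK lets the load be
--    minimally logged under the simple or bulk-logged recovery model.
BULK INSERT new_items
FROM 'C:\temp\items.dat'
WITH (DATAFILETYPE = 'native', TABLOCK);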

Also try the Import/Export wizard that comes with SQL Server Management Studio. I'm not sure whether it will be as fast as a straight bulk copy, but it lets you skip the intermediate flat file and copy directly table-to-table, which might be a bit faster than your SELECT INTO statement.

answered by Stephen Chung


I have been working with our DBA to copy an audit table with 240M rows to another database.

Using a simple select/insert created a huge tempdb file.

Using the Import/Export wizard worked, but it copied only 8M rows in 10 minutes.

Creating a custom SSIS package and adjusting its settings copied 30M rows in 10 minutes.

The SSIS package turned out to be the fastest and most efficient option for our purposes.

Earl

answered by earlxtr


Here's another way of transferring large tables. I've just transferred 105 million rows between two servers using this. Quite quick too.

  1. Right-click on the database and choose Tasks/Export Data.
  2. A wizard will take you through the steps. Choosing your SQL Server client as both the data source and the destination lets you select the database and table(s) you wish to transfer.

For more information, see https://www.mssqltips.com/sqlservertutorial/202/simple-way-to-export-data-from-sql-server/

answered by user489998