 

Copying 6000 tables and data from SQL Server to Oracle ==> fastest method?

I need to copy the tables and data (about 5 years of data, 6200 tables) stored in SQL Server over to Oracle. I am using DataStage with an ODBC connection, and DataStage automatically creates each table and loads the data, but it is taking 2-3 hours per table because the tables are very large (0.5 GB, 300+ columns, and about 400k rows each).

How can I do this faster? At the current rate I can only copy about 5 tables per day, but I need to move all 6000+ tables within 30 days.

asked Apr 28 '10 by user218903

People also ask

How do I load a large amount of data in SSIS?

Ensure that you are using the Fast Load option on the OLE DB Destination. This locks the target table for optimal load patterns. You will likely want to experiment with the maximum insert commit size (name approximate): the default is around 2 billion, so try starting at 50,000.


1 Answer

6000 tables at 0.5 GB each would be about 3 terabytes, plus indexes. I probably wouldn't go for an ODBC connection, but the question is where the bottleneck is.

You have an extract stage from SQL Server. You have the transport from the SQL Server box to the Oracle box. You have the load.

If the network is the limiting capability, you are probably best off extracting to a file, compressing it, transferring the compressed file, uncompressing it, and then loading it. External tables in Oracle are the fastest way to load data from flat files (delimited or fixed length), preferably spread over multiple physical disks to spread the load, and loaded without logging.
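The extract/compress/transfer/load pipeline above can be sketched as follows. This is a minimal illustration, not the answerer's actual setup: the server, database, table, column names, staging paths, and the Oracle directory object (`DATA_PUMP_DIR`) are all hypothetical placeholders.

```python
# Sketch: generate the per-table extract command (SQL Server bcp) and the
# Oracle external-table DDL over the transferred flat file.
# All names/paths below are illustrative assumptions, not from the question.

def bcp_export_cmd(table, out_file, server="SQLSRV01", db="SourceDB"):
    """Build a bcp command that dumps a table to a pipe-delimited flat file."""
    return (f"bcp {db}.dbo.{table} out {out_file} "
            f"-S {server} -T -c -t '|'")

def external_table_ddl(table, columns, flat_file, ora_dir="DATA_PUMP_DIR"):
    """Build DDL for an Oracle external table reading the flat file."""
    col_defs = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns)
    return f"""CREATE TABLE {table}_EXT (
  {col_defs}
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY {ora_dir}
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|'
  )
  LOCATION ('{flat_file}')
)
REJECT LIMIT UNLIMITED"""

print(bcp_export_cmd("ORDERS", "/staging/orders.dat"))
print(external_table_ddl("ORDERS",
                         [("ORDER_ID", "NUMBER"), ("ORDER_DATE", "DATE")],
                         "orders.dat"))
```

The final step would then be a direct-path insert such as `INSERT /*+ APPEND */ INTO orders SELECT * FROM orders_ext;` on a `NOLOGGING` target table, which avoids conventional redo logging.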

Unless there's a significant transformation happening, I'd forget DataStage. Anything that isn't extracting or loading is overhead to be minimised.
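Whatever per-table pipeline is used, the deadline arithmetic (6000 tables in 30 days is roughly 200 tables per day) means many tables must be copied in parallel rather than serially. A minimal sketch of such a driver, where `copy_table` is a stand-in for the real extract/transfer/load steps for one table:

```python
# Sketch: run many per-table copy jobs concurrently.
# copy_table is a placeholder; in practice it would shell out to the
# export, transfer, and load commands for a single table.
from concurrent.futures import ThreadPoolExecutor, as_completed

def copy_table(table):
    # Placeholder for: extract to file, compress, transfer, load.
    return (table, "ok")

tables = [f"TABLE_{i:04d}" for i in range(6000)]

results = {}
# max_workers should be tuned to what the disks and network can sustain.
with ThreadPoolExecutor(max_workers=8) as pool:
    futures = {pool.submit(copy_table, t): t for t in tables}
    for fut in as_completed(futures):
        table, status = fut.result()
        results[table] = status

print(sum(1 for s in results.values() if s == "ok"), "tables copied")
```

A thread pool is enough here because the real work (bcp, scp, SQL*Loader or external-table inserts) happens in external processes, so the driver spends its time waiting, not computing.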

answered Sep 17 '22 by Gary Myers