 

Fast way to load over a billion rows into Oracle from text files

I have approximately 20 tab-delimited text files, all in the same format. Each file is hundreds of megabytes, and all together I expect about 1.2 billion rows of data.

My question: what is the best (and fastest) way to load these into an Oracle table? I attempted to load them via the built-in import feature in TOAD, but that was only doing about 7,500 records a minute. At that rate, 1.2 billion rows would take over 110 days to import.

I have no problem kicking off the process once per input file, but I need ideas on how to import this data quickly.

asked Jan 17 '23 by user1181913

1 Answer

Assuming you have the ability to copy these files to the database server, the most efficient approach should be to use external tables. You'd then just need to run a single INSERT ... SELECT (or CREATE TABLE ... AS SELECT) statement to load the data.
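
For illustration, here is a minimal sketch of that approach, assuming the files live in a directory such as /data/loads on the database server and that each row has three tab-separated columns. The directory object, table names, file names, and column definitions are placeholders to adapt to your actual layout:

    -- All names, paths, and columns below are placeholders.
    -- Requires the CREATE ANY DIRECTORY privilege (or ask a DBA).
    CREATE OR REPLACE DIRECTORY load_dir AS '/data/loads';

    CREATE TABLE staging_ext (
      id      NUMBER,
      name    VARCHAR2(100),
      amount  NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY load_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY 0X'09'     -- tab character
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('file01.txt', 'file02.txt')  -- list all 20 files here
    )
    PARALLEL
    REJECT LIMIT UNLIMITED;

    -- Load everything with one direct-path, parallel insert.
    ALTER SESSION ENABLE PARALLEL DML;

    INSERT /*+ APPEND */ INTO target_table
    SELECT * FROM staging_ext;

    COMMIT;

Listing every file in the LOCATION clause lets Oracle read them in parallel, and the APPEND hint makes the insert a direct-path load that writes above the high-water mark instead of going row by row through the buffer cache. That is typically orders of magnitude faster than the client-side, row-at-a-time inserts a tool like TOAD performs.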

answered Feb 21 '23 by Justin Cave