Insert large amount of data efficiently with SQL

Hi, I often have to insert a lot of data into a table. For example, I might have data from Excel or a text file in the form of

1,a
3,bsdf
4,sdkfj
5,something
129,else

I then construct one INSERT statement per row (five in this example) and run the resulting SQL script. I found this slow when I have to send thousands of small statements to the server, and it also adds network overhead.
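
For concreteness, this is roughly the kind of script I end up generating (the table and column names are made up for illustration):

insert into my_table (id, data) values (1, 'a');
insert into my_table (id, data) values (3, 'bsdf');
insert into my_table (id, data) values (4, 'sdkfj');
insert into my_table (id, data) values (5, 'something');
insert into my_table (id, data) values (129, 'else');
commit;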

What's your best way of doing this?

Update: I'm using Oracle 10g.

Asked Oct 06 '11 by Dean



2 Answers

Use Oracle external tables.

See also, for example:

  • OraFaq about external tables
  • What Tom thinks about external tables
  • René Nyffenegger's notes about external tables

A simple example that should get you started

You need a file located in a server directory (get familiar with directory objects):

SQL> select directory_path from all_directories where directory_name = 'JTEST';

DIRECTORY_PATH
--------------------------------------------------------------------------------
c:\data\jtest

SQL> !cat ~/.gvfs/jtest\ on\ 192.168.xxx.xxx/exttable-1.csv
1,a
3,bsdf
4,sdkfj
5,something
129,else
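
If the JTEST directory object doesn't exist yet, create it first. A minimal sketch, assuming the OS path already exists on the database server and you have the necessary privileges (the grantee SCOTT is just a placeholder schema):

-- run as a user with the CREATE ANY DIRECTORY privilege
create or replace directory jtest as 'c:\data\jtest';

-- allow the loading schema to read files in that directory
grant read on directory jtest to scott;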

Create an external table:

create table so13t (
  id number(4),
  data varchar2(20)
)
organization external (
  type oracle_loader
  default directory jtest /* jtest is an existing directory object */
  access parameters (
    records delimited by newline
    fields terminated by ','
    missing field values are null
  )
  location ('exttable-1.csv') /* the file located in jtest directory */
)
reject limit unlimited;

Now you can use all the powers of SQL to access the data:

SQL> select * from so13t order by data;

        ID DATA
---------- ------------------------------------------------------------
         1 a
         3 bsdf
       129 else
         4 sdkfj
         5 something
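
From here, loading the rows into an ordinary table is a single set-based statement rather than thousands of small inserts. A sketch, assuming a target table (here called target_table, not part of the original answer) with matching columns; the APPEND hint for a direct-path insert is optional:

-- hypothetical target table
create table target_table (
  id   number(4),
  data varchar2(20)
);

-- set-based load from the external table
insert /*+ append */ into target_table (id, data)
select id, data from so13t;

commit;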
Answered Sep 24 '22 by user272735


I'm not sure if this works in Oracle, but in SQL Server you can use the BULK INSERT statement to load data from a .txt or .csv file.

BULK INSERT [TableName]
FROM 'c:\FileName.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)
GO

Just make sure that the table columns correctly match what's in the .txt file. For a more complicated scenario you may want to use a format file; see: http://msdn.microsoft.com/en-us/library/ms178129.aspx
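
As a rough sketch of what "matching columns" means for the sample data in the question (the table name, file path, and format file name below are placeholders, not from the original answer):

-- hypothetical target table matching the id,value pairs in the file
CREATE TABLE dbo.ImportTarget (
    id   INT,
    data VARCHAR(20)
);

BULK INSERT dbo.ImportTarget
FROM 'c:\FileName.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);

-- or, with a format file instead of inline terminators:
-- BULK INSERT dbo.ImportTarget
-- FROM 'c:\FileName.txt'
-- WITH (FORMATFILE = 'c:\FileName.fmt');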

Answered Sep 22 '22 by Aaron