Load multiple CSV files into one table with SQL*Loader (sqlldr)

I am using SQL*Loader to load multiple CSV files into one table. The process I found is straightforward:

LOAD
  DATA 
  INFILE '/path/file1.csv'
  INFILE '/path/file2.csv'
  INFILE '/path/file3.csv'
  INFILE '/path/file4.csv'
  APPEND INTO TABLE TBL_DATA_FILE
    EVALUATE CHECK_CONSTRAINTS
    REENABLE DISABLED_CONSTRAINTS
    EXCEPTIONS EXCEPTION_TABLE
  FIELDS TERMINATED BY "," 
  OPTIONALLY ENCLOSED BY '"'
  TRAILING NULLCOLS
  ( 
    COL0,
    COL1,
    COL2,
    COL3,
    COL4
  )

But I don't want to use INFILE multiple times, because if I have more than 1000 files I would have to write 1000 INFILE lines in the control file.

So my question is: is there any other way (a loop, or a wildcard like *.csv) to load multiple files without repeating INFILE?

Thanks, Bithun

asked Jul 01 '13 by goldenbutter

People also ask

Can you load data into multiple tables with SQL*loader?

Yes. The Oracle docs note that sqlldr allows multiple "INTO TABLE" clauses in one control file, so a single run can direct data into several different tables.
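
With delimited data, each subsequent INTO TABLE clause usually needs a POSITION(1) on its first field to reset the scan to the start of the record. A rough sketch (table names, column names and the record-type test are hypothetical, not from the question):

-- route each row to TBL_A or TBL_B based on its first character (hypothetical layout)
LOAD DATA
  INFILE 'multi.csv'
  APPEND
  INTO TABLE TBL_A
    WHEN (1:1) = 'A'
    FIELDS TERMINATED BY ','
    (REC_TYPE, COL1, COL2)
  INTO TABLE TBL_B
    WHEN (1:1) = 'B'
    FIELDS TERMINATED BY ','
    (REC_TYPE POSITION(1), COL1, COL2)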

How to run sqlldr for multiple CSV files in a directory?

One way is to put sqlldr in a loop (shell or batch, depending on your platform) and call it once for each CSV file in the directory, passing the file name via the sqlldr data= command line option.
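
For example, a minimal bash sketch (the control file name and credentials are hypothetical, and it assumes the control file is written so that the data= parameter supplies the input file):

#!/bin/bash
# call sqlldr once per CSV, passing each file via data=
# (load.ctl and scott/tiger are placeholders for your own control file and credentials)
for f in /path/file*.csv
do
    sqlldr userid=scott/tiger control=load.ctl data="$f" log="${f%.csv}.log"
done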

How to put multiple files in the same table?

Provided they have the same record format, they can all go into the same table. You can do this by listing them out explicitly:

infile 'file1.txt'
infile 'file2.txt'
...

How to use external tables to load data into a table?

Solution 2: Use external tables and load the data using a PL/SQL procedure:

CREATE PROCEDURE myload AS
BEGIN
  FOR i IN 1 .. 1000 LOOP
    EXECUTE IMMEDIATE
      'ALTER TABLE xtable LOCATION ('''||to_char(i,'FM9999')||'.csv'')';
    INSERT INTO mytable SELECT * FROM xtable;
  END LOOP;
END;
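
The procedure above assumes an external table named xtable whose LOCATION can be switched from file to file. A rough sketch of such a table, matched to the five-column layout from the question (the directory name and column types are hypothetical):

-- external table reading one CSV at a time; data_dir is a hypothetical Oracle directory object
CREATE TABLE xtable (
  col0 VARCHAR2(100),
  col1 VARCHAR2(100),
  col2 VARCHAR2(100),
  col3 VARCHAR2(100),
  col4 VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('1.csv')
)
REJECT LIMIT UNLIMITED;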


3 Answers

You can use wildcards (? for a single character, * for any number of characters) like this:

infile 'file?.csv'

;)
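
Applied to the control file from the question, the long list of INFILE lines collapses into one (a sketch; whether wildcards are expanded in INFILE depends on your SQL*Loader version and platform):

-- one wildcard INFILE instead of 1000 explicit ones
LOAD DATA
  INFILE '/path/file*.csv'
  APPEND INTO TABLE TBL_DATA_FILE
  FIELDS TERMINATED BY ','
  OPTIONALLY ENCLOSED BY '"'
  TRAILING NULLCOLS
  (COL0, COL1, COL2, COL3, COL4)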

answered Oct 06 '22 by Paulo Alexandre Machado


Solution 1: Concatenate the 1000 files into one big file, which is then loaded by SQL*Loader. On Unix, I'd use something like:

cd path
cat file*.csv > all_files.csv
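
The combined file is then loaded in a single run, for example (the control file name and credentials are hypothetical):

# load_all.ctl and scott/tiger are placeholders for your own control file and credentials
sqlldr userid=scott/tiger control=load_all.ctl data=all_files.csv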

answered Oct 07 '22 by wolφi


Loop over the files from the shell:

#!/bin/bash
# point a temporary symlink at each CSV in turn and load it with
# the same control file (which reads tmpFile.csv)
for csvFile in file*.csv
do
    ln -s "$csvFile" tmpFile.csv
    sqlldr control=file_pointing_at_tmpFile.ctl
    rm tmpFile.csv
done

answered Oct 07 '22 by RandomSeed