Use Spool to Export Query Results to a CSV File
spool '<file_path>/<file_name>.csv';
SELECT * FROM schema.table WHERE condition;
spool off;
In order to execute the spool, you'll need to run it as a script (for example, if you are using Oracle SQL Developer, you can press F5 to run it as a script).
In iSQL*Plus, use the preference settings to direct output to a file. SPOOL file_name begins spooling displayed output to the named file, where file_name is the name of the file to which you wish to spool. If you do not specify an extension, SPOOL uses a default extension (LST or LIS on most systems).
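For instance, spooling without an extension picks up that default (the file name and query here are just illustrative):
spool results -- no extension given, so most systems write results.lst
select sysdate from dual;
spool off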
First of all, sqlplus does not support export of data. Oracle has separate tools for exporting data (EXP/IMP up to and including 10g, and expdp/impdp (Data Pump) from 10g onward). If you wish to unload table data into flat files, you need to build your own scripts to do that.
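For example, a minimal Data Pump export of a single table might look like the following (the credentials, table, and file names are placeholders, and DATA_PUMP_DIR must exist as a directory object):
expdp scott/tiger tables=EMP directory=DATA_PUMP_DIR dumpfile=emp.dmp logfile=emp_exp.log
Note that this produces a binary dump file rather than a flat file, which is why spool-based scripts like the ones below are still the usual answer for CSV.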
You could also use the following, although it does introduce spaces between fields.
set colsep , -- separate columns with a comma
set pagesize 0 -- No header rows
set trimspool on -- remove trailing blanks
set headsep off -- this may or may not be useful...depends on your headings.
set linesize X -- X should be the sum of the column widths
set numw X -- X should be the length you want for numbers (avoid scientific notation on IDs)
spool myfile.csv
select table_name, tablespace_name
from all_tables
where owner = 'SYS'
and tablespace_name is not null;
spool off
Output will be like:
TABLE_PRIVILEGE_MAP ,SYSTEM
SYSTEM_PRIVILEGE_MAP ,SYSTEM
STMT_AUDIT_OPTION_MAP ,SYSTEM
DUAL ,SYSTEM
...
This would be a lot less tedious than typing out all of the fields and concatenating them with the commas. You could follow up with a simple sed script to remove whitespace that appears before a comma, if you wanted.
Something like this should work (redirect the result to a new file, or use GNU sed's -i flag to edit in place):
sed 's/[[:space:]]*,/,/g' myfile.csv
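Applied to the sample output above, that yields:
TABLE_PRIVILEGE_MAP,SYSTEM
SYSTEM_PRIVILEGE_MAP,SYSTEM
STMT_AUDIT_OPTION_MAP,SYSTEM
DUAL,SYSTEM
...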
If you are using the 12.2 client or later, you can simply say
set markup csv on
spool myfile.csv
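A complete script using this option might look like the following (reusing the all_tables query from the earlier example):
set markup csv on
spool myfile.csv
select table_name, tablespace_name
from all_tables
where owner = 'SYS'
and tablespace_name is not null;
spool off
SET MARKUP CSV also accepts DELIMITER and QUOTE options if you need a separator other than a comma or want unquoted strings.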
I use this for scripts that extract data for dimensional (DW) tables, with the following syntax:
set colsep '|'
set echo off
set feedback off
set linesize 1000
set pagesize 0
set sqlprompt ''
set trimspool on
set headsep off
spool output.dat
select '|', <table>.*, '|'
from <table>
where <conditions>;
spool off
And it works; the literal '|' columns add a leading and a trailing separator to every row. I don't use sed to format the output file.
I see a similar problem...
I need to spool a CSV file from SQL*Plus, but the output has 250 columns.
What I did to avoid the annoying SQL*Plus output formatting:
set linesize 9999
set pagesize 50000
spool myfile.csv
select x
from
(
select col1||';'||col2||';'||col3||';'||col4||';'||col5||';'||
       col6||';'||col7||';'||col8||';'||col9||';'||col10||';'||
       col11||';'||col12||';'||col13||';'||col14||';'||col15||';'||
       col16||';'||col17||';'||col18||';'||col19||';'||col20||';'||
       col21||';'||col22||';'||col23||';'||col24||';'||col25||';'||
       col26||';'||col27||';'||col28||';'||col29||';'||col30 as x
from (
... here is the "core" select
)
);
spool off
The problem is that you will lose the column header names.
You can add this (note that the header names must be string literals, otherwise Oracle will try to resolve them as columns of dual):
set heading off
spool myfile.csv
select 'col1_name;col2_name;col3_name;col4_name;col5_name;col6_name;'||
       'col7_name;col8_name;col9_name;col10_name;col11_name;col12_name;'||
       'col13_name;col14_name;col15_name;col16_name;col17_name;col18_name;'||
       'col19_name;col20_name;col21_name;col22_name;col23_name;col24_name;'||
       'col25_name;col26_name;col27_name;col28_name;col29_name;col30_name'
from dual;
select x
from
(
select col1||';'||col2||';'||col3||';'||col4||';'||col5||';'||
       col6||';'||col7||';'||col8||';'||col9||';'||col10||';'||
       col11||';'||col12||';'||col13||';'||col14||';'||col15||';'||
       col16||';'||col17||';'||col18||';'||col19||';'||col20||';'||
       col21||';'||col22||';'||col23||';'||col24||';'||col25||';'||
       col26||';'||col27||';'||col28||';'||col29||';'||col30 as x
from (
... here is the "core" select
)
);
spool off
I know it's kinda hardcore, but it works for me...
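If typing out dozens or hundreds of column names by hand is impractical, you can generate both pieces from the data dictionary first. A sketch, assuming the table is called MY_TABLE (LISTAGG needs 11.2 or later, and very wide tables can hit the 4000-byte VARCHAR2 limit):
-- generates the header select to paste into the spool script
select 'select '''||listagg(column_name, ';') within group (order by column_id)||''' from dual;'
from user_tab_columns
where table_name = 'MY_TABLE';
-- generates the col1||';'||col2... expression for the data select
select listagg(column_name, q'[||';'||]') within group (order by column_id)
from user_tab_columns
where table_name = 'MY_TABLE';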
With newer versions of the client tools, there are multiple options to format the query output; the rest is to spool it to a file or save the output as a file, depending on the client tool. Here are a few of the ways:
Using SQL*Plus commands, you can format the output as desired and use SPOOL to write it to a file.
For example,
SQL> SET colsep ,
SQL> SET pagesize 20
SQL> SET trimspool ON
SQL> SET linesize 200
SQL> SELECT * FROM scott.emp;
EMPNO,ENAME ,JOB , MGR,HIREDATE , SAL, COMM, DEPTNO
----------,----------,---------,----------,---------,----------,----------,----------
7369,SMITH ,CLERK , 7902,17-DEC-80, 800, , 20
7499,ALLEN ,SALESMAN , 7698,20-FEB-81, 1600, 300, 30
7521,WARD ,SALESMAN , 7698,22-FEB-81, 1250, 500, 30
7566,JONES ,MANAGER , 7839,02-APR-81, 2975, , 20
7654,MARTIN ,SALESMAN , 7698,28-SEP-81, 1250, 1400, 30
7698,BLAKE ,MANAGER , 7839,01-MAY-81, 2850, , 30
7782,CLARK ,MANAGER , 7839,09-JUN-81, 2450, , 10
7788,SCOTT ,ANALYST , 7566,09-DEC-82, 3000, , 20
7839,KING ,PRESIDENT, ,17-NOV-81, 5000, , 10
7844,TURNER ,SALESMAN , 7698,08-SEP-81, 1500, , 30
7876,ADAMS ,CLERK , 7788,12-JAN-83, 1100, , 20
7900,JAMES ,CLERK , 7698,03-DEC-81, 950, , 30
7902,FORD ,ANALYST , 7566,03-DEC-81, 3000, , 20
7934,MILLER ,CLERK , 7782,23-JAN-82, 1300, , 10
14 rows selected.
SQL>
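To send that to a file instead of the screen, wrap the query in a spool (the file name is a placeholder):
SQL> SPOOL emp.csv
SQL> SELECT * FROM scott.emp;
SQL> SPOOL OFF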
Alternatively, you could use the /*csv*/ hint in SQL Developer. For example, in my SQL Developer version 3.2.20.10:
SELECT /*csv*/ * FROM scott.emp;
Run it as a script (F5) so the hint is applied in the Script Output pane. Now you could save the output into a file.
New in SQL Developer version 4.1, you can use the following just like a sqlplus command and run it as a script; there is no need for the hint in the query.
SET SQLFORMAT csv
Now you could save the output into a file.
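A complete run-as-script example with this setting (the file name is a placeholder):
SET SQLFORMAT csv
SPOOL myfile.csv
SELECT * FROM scott.emp;
SPOOL OFF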
I know this is an old thread; however, I noticed that no one has mentioned the underline option, which can remove the underlines under the column headings.
set pagesize 50000 -- 50k is the max as of 12c
set linesize 10000
set trimspool on -- remove trailing blank spaces
set underline off -- remove the dashes/underlines under the col headers
set colsep ~
select * from DW_TMC_PROJECT_VW;
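To capture that to a file, wrap the query in a spool as in the earlier examples (the file name is a placeholder; the ~ separator is handy when the data itself contains commas):
spool project_data.csv
select * from DW_TMC_PROJECT_VW;
spool off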