I have a table "Images" with two fields: name (a VARCHAR2) and data (a BLOB).
I would like to export that table to a .sql file which I could import on another system. I tried to do so using the "Database unload" assistant of Oracle SQL Developer. However, the generated file only contains the names, not the BLOB data. Thus after importing I would end up with all the names, but the data field would be null everywhere.
I'd really prefer it to be just one file (I saw some examples that dump the data to one file per field on the filesystem...).
Is it possible to generate such a script with SQL Developer? Or is there any other way/tool to do so?
First we create a directory object pointing to the destination directory:

CONN / AS SYSDBA

CREATE OR REPLACE DIRECTORY BLOBS AS '/tmp/';
GRANT READ, WRITE ON DIRECTORY BLOBS TO my_user;

Next we open the BLOB, read it in chunks into a buffer and write each chunk to a file; a sketch of that loop follows below. Finally, you can check that the file was produced correctly.
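A minimal sketch of that read/write loop, assuming the images table from the question and a hypothetical row/file name example.png:

declare
  l_blob   blob;
  l_file   utl_file.file_type;
  l_buffer raw(32767);
  l_amount binary_integer := 32767;
  l_pos    integer := 1;
  l_length integer;
begin
  select data into l_blob from images where name = 'example.png';
  l_length := dbms_lob.getlength(l_blob);

  -- open the target file in the BLOBS directory in binary write mode
  l_file := utl_file.fopen('BLOBS', 'example.png', 'wb', 32767);

  -- read the BLOB chunk by chunk and write each chunk to the file
  while l_pos <= l_length loop
    dbms_lob.read(l_blob, l_amount, l_pos, l_buffer);
    utl_file.put_raw(l_file, l_buffer, true);
    l_pos := l_pos + l_amount;
  end loop;

  utl_file.fclose(l_file);
end;
/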
I don't think this is possible with SQL Developer (but then I don't use it very often).
The SQL client I am using - SQL Workbench/J - can do this.
There are several ways to export this data.
It can create a SQL script that uses a special (tool specific) notation to reference an external file, something like:
INSERT INTO images
(name, data)
VALUES
('foobar', {$blobfile='blob_r1_c2.data'});
The above statement can only be executed by SQL Workbench itself; it is not compatible with any other SQL client.
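For reference, the export itself is done with SQL Workbench/J's WbExport command; a sketch from memory (check the exact parameter names against the WbExport documentation), where WbExport applies to the SELECT that follows it:

WbExport -type=sqlinsert
         -file=images.sql;
SELECT name, data FROM images;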
Another alternative is to use a "blob literal", but due to Oracle's limit on 4000 bytes for a character literal, this only works for really small blob values:
INSERT INTO images
(name, data)
VALUES
('foobar', to_blob(hextoraw('......')));
where the character literal passed to hextoraw contains the hex representation of the BLOB. As this requires 2 characters per BLOB byte, you can't handle BLOBs larger than 2000 bytes with that. But that syntax works with nearly all Oracle SQL tools (provided they can handle scripts with very long lines).
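For example, a row with a four-byte BLOB (a hypothetical row named 'tiny') could be inserted like this:

INSERT INTO images
  (name, data)
VALUES
  ('tiny', to_blob(hextoraw('DEADBEEF')));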
The third alternative is to export the data into a text file that can be imported using SQL*Loader:
The text file (tab-separated) would contain something like this:

NAME	DATA
foobar	blob_r1_c2.data
Together with the following SQL*Loader control file:
OPTIONS (skip=1)
LOAD DATA
CHARACTERSET 'WE8ISO8859P15'
INFILE 'images.txt'
APPEND
INTO TABLE IMAGES
FIELDS TERMINATED BY '\t' TRAILING NULLCOLS
(
  NAME,
  lob_file_data FILLER,
  DATA LOBFILE(lob_file_data) TERMINATED BY EOF
)
This can be loaded using SQL*Loader and thus doesn't need SQL Workbench to import the data.
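The load itself is then run from the command line; a sketch, assuming the control file was saved as images.ctl (a hypothetical name):

sqlldr my_user/my_password control=images.ctl log=images.log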
More details are in the SQL*Loader manual.
Edit
As Alex has pointed out in his comment, you can also use a DataPump export - but that requires that you have access to the file system on the server. The above solutions all store the data on the client.
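For completeness, such a DataPump round trip could look like this (a sketch; user, directory object, and file names are placeholders):

expdp my_user/my_password tables=images directory=BLOBS dumpfile=images.dmp logfile=images_exp.log
impdp my_user/my_password tables=images directory=BLOBS dumpfile=images.dmp logfile=images_imp.log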
If you absolutely need a single .sql file to import the BLOB, you can generate the script using PL/SQL:
set serveroutput on

declare
  lob_in      blob;
  lob_size    integer;
  buffer_size integer := 1000;
  buffer      raw(32767);
begin
  select data, dbms_lob.getlength(data)
    into lob_in, lob_size
    from images
   where name = 'example.png';

  -- one chunk per iteration; ceil(...) - 1 is the index of the last chunk,
  -- so an exact multiple of buffer_size doesn't emit an empty trailing call
  for i in 0 .. ceil(lob_size / buffer_size) - 1 loop
    buffer := dbms_lob.substr(lob_in, buffer_size, i * buffer_size + 1);
    dbms_output.put('dbms_lob.append(lob_out, hextoraw(''');
    dbms_output.put(rawtohex(buffer));
    dbms_output.put_line('''));');
  end loop;
end;
/
Its output will be the BLOB's content encoded like:
dbms_lob.append(lob_out, hextoraw('FFD8FFE0...0000'));
dbms_lob.append(lob_out, hextoraw('00000000...0000'));
...
dbms_lob.append(lob_out, hextoraw('007FFFD9'));
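To capture that output in an actual .sql file, run the block with spooling enabled (SQL*Plus or a SQL Developer worksheet); a minimal sketch, with a hypothetical file name:

spool images_data.sql
-- run the generator block from above here
spool off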
You can then load this into an already inserted row with PL/SQL:
declare
  lob_out blob;
begin
  select data
    into lob_out
    from images
   where name = 'example.png'
     for update;

  dbms_lob.append(lob_out, hextoraw('FFD8FFE0...0000'));
  dbms_lob.append(lob_out, hextoraw('00000000...0000'));
  ...
  dbms_lob.append(lob_out, hextoraw('007FFFD9'));
end;
/
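Note that dbms_lob.append needs an existing row with a non-null BLOB to append to. If the target row doesn't exist yet, a seed insert (with hypothetical values) could look like:

insert into images (name, data) values ('example.png', empty_blob());
commit;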
Just remember that the resulting .sql file will be huge; the hex encoding alone takes two characters per BLOB byte. Also remember to commit after running the generated block, as the SELECT ... FOR UPDATE keeps the transaction open.