I need to export data from a table to a CSV file. I have the following structure (not really my table, but for demo purposes):
CREATE TABLE `mytable` (
`id` int(11) DEFAULT NULL,
`mycolumn` varchar(25) DEFAULT NULL
) ENGINE=MyISAM DEFAULT CHARSET=latin1
with data (about 3000 records). Now I want to export some of these records from a script I run via cronjob:
SELECT * INTO OUTFILE '/tmp/mytable.sql'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM mytable;
It shows:
Query OK, 3 rows affected (0.00 sec)
Then I run ls /tmp/mytable.sql and get:
ls: cannot access /tmp/mytable.sql: No such file or directory
Where is my file?
Once connected to the database in MySQL Workbench, under Management choose Data Export. Select the tables to be exported and the configuration. Dump Data and Structure: both the data and the structure of the database will be exported. Export to Self-Contained File: the database will be exported into a single file.
When you use the INTO OUTFILE clause, MySQL writes the file to the filesystem of the server it runs on, not to the machine from which you execute the query.
Example: you are on your computer (ip: 192.168.0.100) and you connect to the MySQL server (ip: 192.168.0.101) using the mysql command: mysql -uuser -h192.168.0.101 -A database. When you execute the SELECT * INTO OUTFILE, the file is saved on the MySQL server (ip: 192.168.0.101) and NOT on your computer (ip: 192.168.0.100).
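If you want to verify this, check on the server side. A minimal sketch in bash (the credentials and the ssh access are assumptions; also note that on newer MySQL versions the secure_file_priv variable may restrict which directories INTO OUTFILE is allowed to write to):
# Check whether the server restricts INTO OUTFILE to a specific directory:
mysql -uuser -p -h192.168.0.101 -e "SHOW VARIABLES LIKE 'secure_file_priv';"
# Look for the exported file on the server itself, not on the client machine:
ssh user@192.168.0.101 'ls -l /tmp/mytable.sql'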
Now, you can use a script that creates a CSV file in your cronjob: you select all the data, generate the file, and send it via scp to the other server.
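A minimal sketch of such a cronjob script, run on the MySQL server itself (credentials, hosts and paths below are placeholders):
#!/bin/bash
# Export the rows with INTO OUTFILE on the database server, then push the CSV elsewhere.
# A dated filename avoids the "file already exists" error, since INTO OUTFILE never overwrites.
OUTFILE=/tmp/mytable-$(date +%F).csv
mysql -uuser -ppassword database -e \
  "SELECT * INTO OUTFILE '$OUTFILE' FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' LINES TERMINATED BY '\n' FROM mytable;"
# Copy the generated file to the machine that actually needs it:
scp "$OUTFILE" user@192.168.0.100:/data/exports/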
Or you can have an NFS share mounted on /shared/, so that when you create the file there the other server has it automatically.
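With such a mount in place, the same export just points at the shared path; a sketch (note that the MySQL server process, not your shell user, writes the file, so it needs write permission on the mount):
mysql -uuser -ppassword database -e \
  "SELECT * INTO OUTFILE '/shared/mytable.csv' FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' LINES TERMINATED BY '\n' FROM mytable;"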
Or you can simply run a mysql command in a bash script like this from your first server:
mysql -uroot test -B -e "select * from test.mytable;" | sed 's/\t/","/g;s/^/"/;s/$/"/;s/\n//g' > /tmp/filename.csv
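The -B flag makes mysql print plain tab-separated rows instead of the usual ASCII table, and the sed expression turns each tab into "," and wraps every line in quotes, giving one quoted CSV row per line. Keep in mind this simple approach does not escape quotes, tabs or newlines that appear inside the data itself.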
source: http://tlug.dnho.net/node/209