My company is currently moving our databases around, shifting one set of tables out of the old MySQL instance into the new one. We did some development prior to this migration, and some tables' structure has been altered from the original (e.g. columns were dropped).
So I've dumped the data from the old database and am now attempting to reinsert it into the new table. Of course, the import fails when it tries to insert rows with more fields than the table has.
What's the best way (preferably scriptable, because I foresee myself having to do this a few more times) to import only the fields I need into the new table?
Update the following to suit:
SELECT CONCAT('INSERT INTO NEW_TABLE (...) VALUES (', ot.column, ');')
FROM OLD_TABLE ot;
You need an INSERT statement for the table in the new database, with an explicit column list. Then build the value portion from the corresponding values in the old table. Run it against the old environment and you'll have your INSERTs, data included, for the new environment - just copy and paste them into a script.
Mind, though, that data types have to be handled accordingly: dates (including times) and strings need to be quoted and escaped, because the generated statements are plain text.
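A minimal sketch of that idea, assuming hypothetical columns id and name are the ones that survive into NEW_TABLE, and using MySQL's CONCAT() and QUOTE() to take care of the string quoting:
SELECT CONCAT('INSERT INTO NEW_TABLE (id, name) VALUES (',
              ot.id, ', ', QUOTE(ot.name), ');') AS stmt
FROM OLD_TABLE ot;
Each row of the result is one ready-to-run INSERT; redirect the output to a .sql file and replay it against the new database.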
First of all, create a new database with the old structure, or temporary tables in the current database, and load the dump into that. Then run an INSERT ... SELECT whose column list contains only the fields that exist in the new structure:
INSERT INTO newTable (col1, col2) SELECT col1, col2 FROM tempTable;
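Spelled out, a sketch along those lines (the table and column names here are made up; legacy_notes stands in for a column that was dropped from the new schema):
-- Staging table with the old structure, so the full dump loads cleanly
CREATE TABLE old_table_staging (
  id INT,
  name VARCHAR(255),
  legacy_notes TEXT
);
-- ... replay the old dump into old_table_staging here ...
-- Copy across only the columns the new table still has
INSERT INTO new_table (id, name)
SELECT id, name
FROM old_table_staging;
DROP TABLE old_table_staging;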
The fastest way is LOAD DATA INFILE:
-- Dump the data
SELECT * INTO OUTFILE 'mybigtable.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM mybigtable;
-- Load the data, keeping only the columns the new table needs
LOAD DATA LOCAL INFILE 'mybigtable.csv'
INTO TABLE mynewbigtable
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(@col1, @col2, @col3, @col4)
SET name = @col4, id = @col2;
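Two server settings are worth checking first (not part of the snippet above, just standard MySQL variables): SELECT ... INTO OUTFILE writes the file on the server and is restricted by secure_file_priv, while LOAD DATA LOCAL INFILE reads from the client and needs local_infile enabled.
-- Directory (if any) the server is allowed to write export files into
SHOW VARIABLES LIKE 'secure_file_priv';
-- Must be ON for LOAD DATA LOCAL INFILE
SHOW VARIABLES LIKE 'local_infile';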
Refs:
http://dev.mysql.com/doc/refman/5.6/en/insert-speed.html
http://dev.mysql.com/doc/refman/5.6/en/load-data.html