 

Edit a very large SQL dump/text file (on Linux)

I have to import a large MySQL dump (up to 10 GB). However, the dump already contains the database structure, including index definitions. I want to speed up the inserts by removing the index and table definitions.

That means I have to remove or edit the first few lines of a 10 GB text file. What is the most efficient way to do this on Linux?

Programs that require loading the entire file into RAM would be overkill here.

asked Mar 31 '09 by geo

1 Answer

Rather than removing the first few lines, try editing them to be whitespace. That keeps the file size and byte offsets unchanged, so the remaining 10 GB never has to be rewritten.

The hexedit program can do this: it reads files in chunks, so opening a 10 GB file is no different for it than opening a 100 KB one.

$ hexedit largefile.sql.dump
Tab     switch to the ASCII side
Space   overwrite the byte under the cursor with a space (repeat until the header is gone)
F2      save
Ctrl-X  save and exit
Ctrl-C  exit without saving
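
If you'd rather not do it interactively, the same in-place blanking can be scripted with dd. This is only a rough sketch: the 4096-byte count is an assumption, so check how far the table/index definitions actually extend first and pick a count that ends exactly at a statement boundary (a half-blanked CREATE TABLE would leave a broken statement for mysql to choke on).

$ head -c 4096 largefile.sql.dump                # inspect how far the header really extends
$ head -c 4096 /dev/zero | tr '\0' ' ' \
    | dd of=largefile.sql.dump conv=notrunc      # overwrite the first 4096 bytes with spaces
$ head -c 4096 largefile.sql.dump                # verify: should now be all spaces

The conv=notrunc flag is what makes this safe on a 10 GB file: dd writes only the bytes it is given at the start of the file and leaves everything after them untouched, so the file size never changes.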
answered Sep 25 '22 by rmmh