MYSQLDUMP failing: Couldn't execute 'SHOW TRIGGERS LIKE ...' errors like (Errcode: 13) (6) and (1036) [duplicate]

Does anyone know why MYSQLDUMP would only perform a partial backup of a database when run with the following instruction:

"C:\Program Files\MySQL\MySQL Server 5.5\bin\mysqldump" databaseSchema -u root --password=rootPassword > c:\backups\daily\mySchema.dump

Sometimes a full backup is performed, at other times the backup will stop after including only a fraction of the database. This fraction is variable.

The database does have several thousand tables totalling about 11 GB, but most of these tables are quite small, with only about 1,500 records; many have only 150-200. The column counts of these tables can run into the hundreds, though, because of the frequency data stored.

But I am informed that the number of tables in a schema in MySQL is not an issue. There are also no performance issues during normal operation.

And the alternative of using a single table is not really viable because all of these tables have different column name signatures.

I should add that the database is in use during the backup.

Well, after running the backup with the following instruction:

"C:\Program Files\MySQL\MySQL Server 5.5\bin\mysqldump" mySchema -u root --password=xxxxxxx -v --debug-check --log-error=c:\backups\daily\mySchema_error.log > c:\backups\daily\mySchema.dump

I get this:

    mysqldump: Couldn't execute 'SHOW TRIGGERS LIKE '\_dm\_10730\_856956\_30072013\_1375194514706\_keyword\_frequencies'': Error on delete of 'C:\Windows\TEMP\#sql67c_10_8c5.MYI' (Errcode: 13) (6)

Which I think is a permissions problem.
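
If it is permissions, one quick check is to look at (and, if necessary, widen) the ACLs on the temp directory mysqld is writing to. A minimal sketch, assuming the MySQL service runs as NETWORK SERVICE (substitute whatever account your service actually runs under):

    :: Show current ACLs on the temp directory named in the error
    icacls C:\Windows\TEMP

    :: Grant the service account Modify rights, inherited by new files and folders
    :: ("NETWORK SERVICE" is an assumption -- use your actual MySQL service account)
    icacls C:\Windows\TEMP /grant "NETWORK SERVICE":(OI)(CI)M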

I doubt any one table in my schema is in the 2GB range.

I am using MySQL Server 5.5 on a 64-bit Windows 7 server with 8 GB of memory.

Any ideas?

I am aware that changing the number of files which MySQL can open, the open_files_limit parameter, may cure this matter.
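
For reference, the current value can be checked from the command line and raised in my.ini; the 8192 below is just an illustrative value, not a recommendation:

    :: Check the current limit
    mysql -u root -p -e "SHOW VARIABLES LIKE 'open_files_limit';"

    :: To raise it, add this under the [mysqld] section of my.ini and restart the service:
    ::   open_files_limit=8192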

Another possibility is interference from anti-virus products, as described here:

How To Fix Intermittent MySQL Errcode 13 Errors On Windows

asked Aug 10 '13 21:08 by Mr Morgan


1 Answer

There are a few possibilities for this issue that I have run into; here is my workup:

First: enable error/debug logging and/or verbose output; otherwise we won't see whatever error is creating the issue:

    "c:\path\to\mysqldump" -b yourdb -u root -pRootPasswd -v --debug-check --log-error=c:\backup\mysqldump_error.log > c:\backup\visualRSS.dump

So long as debugging is enabled in your build, you should now be able both to log errors to a file and to view output on the console. The issue is not always clear here, but it is a great first step.

Have you reviewed your error and general logs? They don't often hold useful information for this issue, but sometimes they do, and every little bit helps when tracking these problems down.
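
If you are not sure where the error log lives, you can ask the server; a small sketch:

    :: Show where the server is writing its error log
    mysql -u root -p -e "SHOW VARIABLES LIKE 'log_error';"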

Also watch SHOW PROCESSLIST while you are running this. See whether you see State values like 'Waiting for table metadata lock', which would indicate the operation is unable to acquire a lock because of another operation.
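
For example, from a second session while the dump runs:

    :: The State column shows any thread stuck waiting on a table or metadata lock
    mysql -u root -p -e "SHOW FULL PROCESSLIST;"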

Depending on the info gathered above: assuming I found nothing and had to shoot blind, here is what I would do next, covering some common cases I have experienced:

  • Max packet size errors: If you receive an error regarding max_allowed_packet, you can add --max_allowed_packet=160M to your parameters to see if you can get it large enough:

"c:\path\to\mysqldump" -b yourdb -u root -pRootPasswd -v --debug-check --log-error=c:\backup\mysqldump_error.log --max_allowed_packet=160M > c:\backup\visualRSS.dump

  • Try to reduce run time/size using the --compact flag. mysqldump adds everything you need to create the schema and insert the data, along with other information; you can significantly reduce run time and file size by requiring that the dump contain only the INSERTs into your schema, avoiding all the statements to create the schema and other non-critical info within each insert. This can mitigate a lot of problems if it is appropriate for your use, but you will want a separate dump with --no-data to export your schema each run, so that you can recreate all the empty tables, etc.:

Create raw data, excluding add/drop-table, comment, lock, and key-check statements:

    "c:\path\to\mysqldump" -B yourdb -u root -pRootPasswd -v --debug-check --log-error=c:\backup\mysqldump_error.log --compact > c:\backup\visualRSS.dump

Create a schema dump with no data:

    "c:\path\to\mysqldump" -B yourdb -u root -pRootPasswd -v --debug-check --log-error=c:\backup\mysqldump_error.log --no-data > c:\backup\visualRSS.dump

  • Locking issues: By default, mysqldump uses LOCK TABLES (unless you specify --single-transaction) to acquire a read lock on each table while dumping it; DDL operations and your global lock type can create this situation. Without seeing the hung query, you will typically see a small backup file, as you described, and usually the mysqldump operation will sit until you kill it or the server closes the idle connection. You can use the --single-transaction flag to set the REPEATABLE READ isolation level for the transaction, essentially taking a snapshot of the tables without blocking operations or being blocked, save for some older server versions that have issues with ALTER/TRUNCATE TABLE while in this mode; see the sketch below.
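
For InnoDB tables, that would look something like this (a sketch reusing the placeholder paths from above):

    :: Consistent snapshot dump; avoids LOCK TABLES entirely (InnoDB only)
    "c:\path\to\mysqldump" --single-transaction -B yourdb -u root -pRootPasswd -v --debug-check --log-error=c:\backup\mysqldump_error.log > c:\backup\visualRSS.dump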

  • File size issues: If I read incorrectly and this backup has NOT successfully run before, indicating a potential 2 GB file-size issue, you can try piping mysqldump output straight into something like 7-Zip on the fly:

    mysqldump [options] | 7z.exe a -siname_in_outfile output_path_and_filename

If you continue to have issues, or there is an unavoidable problem prohibiting mysqldump from being used: Percona XtraBackup is what I prefer, and there is also MySQL Enterprise Backup from Oracle. XtraBackup is open source, far more versatile than mysqldump, has a very reliable group of developers working on it, and has many great features that mysqldump does not, like streaming/hot backups. Unfortunately, the Windows build is old, so you may need to compile it yourself or run a local Linux VM to handle that for you.

Very important: I noticed that you are not backing up your information_schema database. mysqldump skips it unless you name it explicitly, so it needs to be mentioned explicitly if it is of significance to your backup scheme.
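
If it is, something like this should work (a sketch; naming information_schema explicitly generally requires --skip-lock-tables, since it cannot be locked like a normal schema):

    :: Dump information_schema by naming it explicitly; it is excluded by default
    "c:\path\to\mysqldump" -u root -pRootPasswd --skip-lock-tables information_schema > c:\backup\information_schema.dump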

answered Nov 15 '22 00:11 by cerd