gzip: stdout: File too large when running customized backup script

Tags:

bash

shell

I've created a plain and simple backup script that only backs up certain files and folders:

tar -zcf $DIRECTORY/var.www.tar.gz /var/www
tar -zcf $DIRECTORY/development.tar.gz /development
tar -zcf $DIRECTORY/home.tar.gz /home

This script runs for about 30 minutes and then gives me the following error:

gzip: stdout: File too large

Are there any other solutions I can use to back up my files with shell scripting, or a way to solve this error? I'm grateful for any help.

Asked by Elitmiar, Apr 23 '10

2 Answers

"File too large" is an error message from your libc: the output has exceeded the maximum file size your filesystem allows.

So this is not a gzip issue.

Options: use another filesystem, or use split:

tar czf - www | split -b 1073741824 - www-backup.tar.

creates the backup split into 1 GiB pieces (1073741824 bytes = 1 GiB).

Restore it from multiple parts:

cat www-backup.tar.* | gunzip -c | tar xvf -
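
If you want to keep the structure of your original script, here is a rough sketch of how each directory could be piped through split in the same way. The paths and the $DIRECTORY variable come from your question; the 1 GiB chunk size and the piece naming are assumptions you can adjust.

#!/bin/bash
# Sketch: archive each directory and split the compressed stream into
# 1 GiB pieces so no single output file hits the filesystem's size limit.
DIRECTORY=/backups    # assumption: replace with your real backup destination

for src in /var/www /development /home; do
    name=${src#/}          # strip leading slash: /var/www -> var/www
    name=${name//\//.}     # turn remaining slashes into dots: var.www
    tar czf - "$src" | split -b 1073741824 - "$DIRECTORY/$name.tar.gz."
done

Restoring one of them would then look like cat $DIRECTORY/var.www.tar.gz.* | gunzip -c | tar xvf -, analogous to the restore command above.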
Answered by Jürgen Hötzel


Can the file system you are backing up to support large files?

Specifically, FAT32 has a limit of ~4GB in a single file, and other filesystems have similar limits.

If your backup is running for 30 minutes, the file could easily be reaching that sort of size.
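
One quick way to check, assuming a GNU/Linux system and that $DIRECTORY is the backup destination from the question, is to ask df for the filesystem type of the target:

df -T "$DIRECTORY"    # prints the filesystem type, e.g. vfat, ext3, ext4

If it reports vfat (FAT32), any single file larger than about 4 GB will fail with this kind of error.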

Answered by Adrian