I've created a plain and simple backup script that only backs up certain files and folders.
tar -zcf $DIRECTORY/var.www.tar.gz /var/www
tar -zcf $DIRECTORY/development.tar.gz /development
tar -zcf $DIRECTORY/home.tar.gz /home
Now this script runs for about 30 minutes and then gives me the following error:
gzip: stdout: File too large
Are there any other solutions I can use to back up my files with shell scripting, or a way to solve this error? I'm grateful for any help.
File too large is an error message from your libc: the output has exceeded the file size limit of your filesystem.
So this is not a gzip issue.
Options: use another filesystem, or use split:
tar czf - www | split -b 1073741824 - www-backup.tar.
creates the backup split into 1 GiB pieces.
Restore it from multiple parts:
cat www-backup.tar.* | gunzip -c | tar xvf -
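For reference, here is a minimal sketch of how your original script could be adapted to use split, so that no single output file exceeds the filesystem's size limit. It assumes $DIRECTORY points at your backup destination and that the 1 GiB chunk size fits your target filesystem:

#!/bin/sh
# Sketch: back up each directory as a gzipped tar stream, split into 1 GiB pieces.
# $DIRECTORY is assumed to be the backup destination.
DIRECTORY=/mnt/backup

for src in /var/www /development /home; do
    # Turn e.g. /var/www into var.www for the archive name
    name=$(echo "$src" | sed 's|^/||; s|/|.|g')
    tar czf - "$src" | split -b 1073741824 - "$DIRECTORY/$name.tar.gz."
done

To restore one of these, concatenate the pieces the same way as above, e.g. cat "$DIRECTORY"/var.www.tar.gz.* | gunzip -c | tar xvf -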
Can the file system you are backing up to support large files?
Specifically, FAT32 has a limit of ~4GB in a single file, and other filesystems have similar limits.
If your backup is running for 30 minutes, the file could easily be reaching that sort of size.
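If you are not sure what the destination filesystem is, a quick check will tell you (assuming $DIRECTORY is where the archives are written; df -T is the GNU/Linux form):

df -T "$DIRECTORY"    # shows the filesystem type (e.g. vfat, ext4) of the backup destination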