tar | gzip is wonderful, except the resulting files can get too big, and transferring them over the network gets complicated. DOS-era archivers were routinely used to create multipart archives, one per floppy, but gzip doesn't seem to have such an option (presumably because of the Unix streaming philosophy).
So what's the easiest and most robust way of doing this under Linux (obviously with archive sizes around 2GB rather than 1.44MB)?
You could split it up into pieces using /usr/bin/split (with the "-b" option); see 'man split'.
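For example, something like this should compress a directory tree and split the stream into roughly 2GB pieces (the path and the output prefix here are just illustrative):

tar czf - /path/to/dir | split -b 2000m - backup.tar.gz.part-

To unpack, concatenate the pieces in order and feed them back to tar:

cat backup.tar.gz.part-* | tar xzf -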
I don't bother using gzip for archiving any more, just for unpacking archives from people who haven't been converted yet :-)
7zip has insane-level compression (although I haven't put it head-to-head in all scenarios), and it also supports creating volumes, which answers your specific question.
For example, the following command compresses the current directory tree into 1GB volumes called /backups/2021_09_28.7z.NNN, where NNN ranges from 001 up to whatever value it needs:
7z a -r -v1g -y /backups/2021_09_28 .
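To extract, you should just be able to point 7z at the first volume and it will pick up the remaining ones automatically, something like:

7z x /backups/2021_09_28.7z.001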