I have a really huge folder that I would like to gzip and split for archiving:
#!/bin/bash
dir=$1
name=$2
size=32000m
tar -cz "${dir}" | split -a 5 -d -b "${size}" - "${name}"
Is there a way to speed this up with GNU parallel? Thanks.
It seems the best tool for parallel gzip compression is pigz. See the comparisons. With it, you can use a command like this:
tar -c "${dir}" | pigz -c | split -a 5 -d -b "${size}" - "${name}"
With its -p option you can also specify the number of threads to use (the default is the number of online processors, or 8 if that cannot be determined). See pigz --help or man pigz for more info.
UPDATE
Using GNU parallel you could do something like this:
contents=("$dir"/*)
outdir=/somewhere
parallel tar -cvpzf "${outdir}/{}.tar.gz" "$dir/{}" ::: "${contents[@]##*/}"
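Note that this produces one .tar.gz per top-level entry of $dir rather than a single split archive. If GNU parallel is not installed, a rough equivalent using plain bash background jobs looks like this (directory names here are illustrative):

```shell
#!/bin/bash
set -e

# Illustrative setup -- replace with your real input/output directories.
dir=$(mktemp -d)
outdir=$(mktemp -d)
mkdir "$dir"/part1 "$dir"/part2
echo one > "$dir"/part1/f.txt
echo two > "$dir"/part2/f.txt

# Compress each top-level entry in its own background job,
# then wait for all jobs to finish. -C keeps archive paths
# relative to $dir instead of absolute.
for entry in "$dir"/*; do
  name=${entry##*/}
  tar -czf "${outdir}/${name}.tar.gz" -C "$dir" "$name" &
done
wait

ls "$outdir"
```

Unlike GNU parallel, this launches all jobs at once with no throttling, so with many top-level entries you would want parallel's default of one job per CPU core instead.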