Parallel tar with split for large folders

I have a really huge folder that I would like to gzip and split for archival:

#!/bin/bash
dir=$1
name=$2
size=32000m
tar -czf - "${dir}" | split -a 5 -d -b "${size}" - "${name}"

Is there a way to speed this up with GNU parallel? Thanks.

Arman asked Mar 25 '26 05:03
1 Answer

It seems the best tool for parallel gzip compression is pigz. See the comparisons.

With it you can have a command like this:

tar -c "${dir}" | pigz -c | split -a 5 -d -b "${size}" - "${name}"

With its option -p you could also specify the number of threads to use (default is the number of online processors, or 8 if unknown). See pigz --help or man pigz for more info.
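Putting the pieces together, here is a minimal sketch of the whole pipeline as a reusable function. It assumes GNU `split` and falls back to plain single-threaded `gzip` when `pigz` is not installed; the function name `archive_split` is just an illustrative choice.

```shell
#!/bin/bash
set -euo pipefail

# archive_split DIR PREFIX [SIZE]: tar DIR, compress it (with pigz in
# parallel if available, otherwise gzip), and split the result into
# numbered chunks PREFIX00000, PREFIX00001, ...
archive_split() {
    local dir=$1 name=$2 size=${3:-32000m}
    local compress
    if command -v pigz >/dev/null 2>&1; then
        compress=pigz    # parallel gzip; add -p N to cap the thread count
    else
        compress=gzip    # single-threaded fallback
    fi
    tar -cf - "$dir" | "$compress" -c | split -a 5 -d -b "$size" - "$name"
}
```

For example, `archive_split photos backup. 32000m` would produce `backup.00000`, `backup.00001`, and so on.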

UPDATE

Using GNU parallel you could do something like this:

contents=("$dir"/*)
outdir=/somewhere
parallel tar -cvpzf "${outdir}/{}.tar.gz" "$dir/{}" ::: "${contents[@]##*/}"
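Whichever variant you use to create the split archive, the pieces from `split` can later be reassembled with plain `cat` and extracted in one pass. A minimal sketch (the function name `restore_split` is illustrative):

```shell
#!/bin/bash
set -euo pipefail

# restore_split PREFIX [DESTDIR]: concatenate the numbered chunks that
# split produced (PREFIX00000, PREFIX00001, ...) in order and extract
# the gzipped tarball into DESTDIR (default: current directory).
restore_split() {
    local name=$1 dest=${2:-.}
    cat "${name}"* | tar -xz -C "$dest"
}
```

This works because the shell glob `"${name}"*` expands the chunks in lexicographic order, which matches the numeric order `split -d` assigned them.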
konsolebox answered Mar 27 '26 09:03


