I have a compression command which I pipe into an md5sum check command as follows:
   tar zcvpf file.tar.gz file | xargs -I '{}' bash -c "test -f '{}' && md5sum '{}'" | tee file_mybackup.md5
I would now like to split the tar.gz file into 100MB chunks. I can do this with:
   tar zcvpf - file | split -d -b 1M - file.tar.gz.
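For reference, the chunks produced this way can be recombined with cat and read straight back by tar (assuming split's default two-digit numeric suffixes):
    cat file.tar.gz.?? | tar ztvf -     # list the contents of the recombined archive
    cat file.tar.gz.?? | tar zxvpf -    # extract, restoring permissions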
Is there a way that I can pipe the output of the tar command so that it simultaneously performs the split and the md5sum check? I know that split does not write to stdout, so I can't pipe from the split command into the md5sum command. I tried to use a named pipe:
    mkfifo newpipe
    tar zcvpf - file | tee newpipe | split -d -b 1M - file.tar.gz. &
    cat newpipe | xargs -I '{}' bash -c "test -f '{}' && md5sum '{}'" | tee file_mybackup.md5
However, this fails to produce any md5sum output. Any help would be appreciated.
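I suspect the reason, assuming GNU tar, is that with -f - the verbose member listing goes to stderr while stdout carries the compressed archive itself, so tee copies gzip bytes rather than filenames into newpipe and xargs finds no matching files. A quick sketch to see what actually comes down stdout:
    # Discard the stderr listing and look at the first bytes of stdout: gzip data, not a file list.
    tar zcvpf - file 2>/dev/null | head -c 16 | od -c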
OK, I eventually came to a solution:
    tar zcvpf >(split -d -b 1M - file.) file | xargs -I '{}' bash -c "test -f '{}' && md5sum '{}'" | tee file.md5
I used process substitution to redirect tar's archive output into split's stdin within the initial tar command, whilst simultaneously piping tar's verbose file listing to xargs. Hope this is of help to someone else.
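For completeness, a small sketch of how the result can be checked afterwards (assuming bash, split's default two-digit suffixes, and the file. / file.md5 names used above):
    # >( ... ) expands to a path like /dev/fd/63; whatever tar writes to that "file" becomes
    # split's standard input, while the verbose listing still goes to stdout for xargs.
    cat file.?? | tar ztvf -     # list the archive rebuilt from the split chunks
    md5sum -c file.md5           # check the originals (or extracted copies) against the stored sums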
Your command:
    tar zcvpf file.tar.gz file | xargs -I '{}' bash -c "test -f '{}' && md5sum '{}'" | tee file_mybackup.md5
...is no more efficient (in terms of minimizing drive reads) than this simpler command (i.e. no need for xargs):
    tar zcvpf file.tar.gz file && md5sum file | tee file_mybackup.md5
because in both cases file is read once by tar and then a second time by md5sum.
Unless there is something I am missing in what you are trying to achieve, the command you would need to minimize drive reads is:
    cat file | tee >(md5sum | sed 's/-/file/' > file_mybackup.md5) | gzip -c | split -d -b 1M - file.gz.
Note: since you are archiving a single file and not a directory, I just used gzip instead of tar. Yes, this does not store file permissions and will not work on a directory! ...was that a requirement?
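If it helps, a sketch of how the pieces might later be checked and restored (assuming the file.gz. prefix and file_mybackup.md5 name from above):
    cat file.gz.?? | gunzip > file      # reassemble the chunks and decompress in one pass
    md5sum -c file_mybackup.md5         # the sed above recorded the hash under the name "file"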