
How to do large file parallel encryption using GnuPG and GNU parallel?

I'm trying to write a parallel compress/encrypt backup script for archiving using GNU parallel, xz and GnuPG. The core part of the script is:

tar --create --format=posix --preserve-permissions --same-owner --directory $BASE/$name --to-stdout . \
    | parallel --pipe --recend '' --keep-order --block-size 128M "xz -9 --check=sha256 | gpg --encrypt --recipient $RECIPIENT" \
    | pv > $TARGET/$FILENAME

Without GnuPG encryption it works great (decompressing and untarring succeed), but after adding parallel encryption, decryption fails with the error below:

[don't know]: invalid packet (ctb=0a)
gpg: WARNING: encrypted message has been manipulated!
gpg: decrypt_message failed: Unexpected error
: Truncated tar archive
tar: Error exit delayed from previous errors.

Because the uncompressed size is the same as GNU parallel's block size (around 125 MB), I assume this is related to GnuPG's support for partial block encryption. How can I solve this problem?


FYI

Another parallel GPG encryption issue, about random number generation:

https://unix.stackexchange.com/questions/105059/parallel-pausing-and-resuming

asked Sep 17 '17 by Yongbin Yu

2 Answers

Pack

tar --create --format=posix --preserve-permissions --same-owner --directory $BASE/$name --to-stdout . |
    parallel --pipe --recend '' --keep-order --block-size 128M "xz -9 --check=sha256 | gpg --encrypt --recipient $RECIPIENT;echo bLoCk EnD" |
    pv > $TARGET/$FILENAME

Unpack

cat $TARGET/$FILENAME |
  parallel --pipe --recend 'bLoCk EnD\n' -N1 --keep-order --rrs 'gpg --decrypt | xz -d' |
  tar tv

-N1 is needed to make sure we pass a single record at a time. GnuPG does not support decrypting multiple merged records.
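
The reason the sentinel works can be seen with plain POSIX tools, no gpg needed: each encrypted block is followed by a literal `bLoCk EnD` line, so a downstream reader can split on (or count) that marker to recover record boundaries. A minimal sketch with made-up sample data standing in for the encrypted blocks:

```shell
# Simulate three "encrypted" blocks, each terminated by the sentinel line,
# the way the pack pipeline above emits them.
stream=$(printf 'AAA\nbLoCk EnD\nBBB\nbLoCk EnD\nCCC\nbLoCk EnD\n')

# Counting the sentinel lines recovers the number of records -- this is
# the boundary that parallel --recend 'bLoCk EnD\n' splits on, and that
# --rrs (--remove-rec-sep) strips before handing each record to gpg.
records=$(printf '%s\n' "$stream" | grep -c '^bLoCk EnD$')
echo "$records"
```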

answered Sep 23 '22 by Ole Tange


GnuPG does not support concatenating multiple encryption streams and decrypting them at once. You will have to store multiple files, and decrypt them individually. If I'm not mistaken, your command even mixes up the outputs of all parallel instances of GnuPG, so the result is more or less random garbage.

Anyway: GnuPG also takes care of compression; have a look at the --compress-algo option. If you prefer to use xz, pass --compress-algo none so GnuPG does not try to compress the already-compressed message again. Encryption is heavily accelerated by CPU instructions nowadays, so xz -9 may in fact take more time than the encryption itself (although I did not benchmark this).
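
If you keep xz in the pipeline, the per-block gpg invocation from the pack command can skip GnuPG's own compression pass. A sketch for a single chunk (chunk.dat and $RECIPIENT are placeholders; this needs a keyring with the recipient's public key to actually run):

```shell
# Compress with xz first, then encrypt without re-compressing:
# --compress-algo none disables GnuPG's internal compression, which would
# otherwise waste CPU trying to shrink already-compressed xz output.
xz -9 --check=sha256 < chunk.dat |
  gpg --compress-algo none --encrypt --recipient "$RECIPIENT" > chunk.dat.xz.gpg
```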

answered Sep 21 '22 by Jens Erat