
Most efficient algorithm to compress folder of files

I have a folder of files and would like to losslessly compress it as efficiently as possible.

The files are very similar to one another in that the main payload is exactly the same but a variable sized header and footer may differ quite a bit between files.

I need to be able to access any of the files very quickly, as well as add new files very quickly (i.e., without having to decompress and recompress the entire folder just to add one file). Deletion from the folder is not very common.

Algorithmic suggestions are fine though I would prefer to just be able to use some existing library/program for this task.

asked Feb 05 '26 by jhchen

2 Answers

In this case, since you have specific knowledge of the files, a custom solution would work best. Store the static main payload only once and then store the headers and footers separately. For example, say you have 3 files:

1.dat
2.dat
3.dat

Store them in the compressed file as:

payload.dat
1.header.dat
1.footer.dat
2.header.dat
2.footer.dat
3.header.dat
3.footer.dat
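
A minimal sketch of this layout in Python, assuming the shared payload is known up front and appears verbatim in each file (the `split_file`, `store`, and `restore` names are illustrative, not from any particular library):

```python
import os

def split_file(data: bytes, payload: bytes):
    # Locate the shared payload and return the variable parts around it.
    start = data.find(payload)
    if start == -1:
        raise ValueError("shared payload not found in file")
    return data[:start], data[start + len(payload):]

def store(src_paths, payload: bytes, out_dir: str):
    os.makedirs(out_dir, exist_ok=True)
    # Store the shared payload exactly once.
    with open(os.path.join(out_dir, "payload.dat"), "wb") as f:
        f.write(payload)
    # Store only the variable header/footer for each file.
    for path in src_paths:
        name = os.path.splitext(os.path.basename(path))[0]
        with open(path, "rb") as f:
            data = f.read()
        header, footer = split_file(data, payload)
        with open(os.path.join(out_dir, f"{name}.header.dat"), "wb") as f:
            f.write(header)
        with open(os.path.join(out_dir, f"{name}.footer.dat"), "wb") as f:
            f.write(footer)

def restore(name: str, store_dir: str) -> bytes:
    # Reassemble a single file on demand: header + payload + footer.
    def read(part):
        with open(os.path.join(store_dir, part), "rb") as f:
            return f.read()
    return read(f"{name}.header.dat") + read("payload.dat") + read(f"{name}.footer.dat")
```

Because each file is reconstructed from just three small reads, random access stays fast, and adding a file only means writing its two variable parts.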

As for adding files, both Zip and 7zip support adding new files to an existing archive, so you can use either and just append new files as needed. Personally, I would recommend 7zip, as I've found it provides much better compression ratios in most situations, though results vary a lot depending on the exact content.
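
For instance, Python's standard zipfile module can open an existing archive in append mode and add new members without recompressing the rest (the archive and member names here are just illustrative):

```python
import zipfile

# Mode "a" appends new entries to an existing archive without
# rewriting the data of the members already stored in it.
with zipfile.ZipFile("archive.zip", "a", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.write("4.header.dat")
    zf.write("4.footer.dat")
```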

answered Feb 12 '26 by Samuel Neff


Last time I checked, 7zip was the best option; I'm not sure whether anything newer has since surpassed it.

answered Feb 12 '26 by sashank


