 

Compressing binary data

At one step of my algorithm I have a large array of binary data that I want to compress.
Which algorithm (or perhaps standard class) would you advise using to compress the data as efficiently as possible?
EDIT:
The data is initially represented as a byte[n] array of 0s and 1s. I then pack every 8 of those bytes into one, producing a byte[n/8] array.
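The packing step described above can be sketched as follows. This is a hypothetical Python helper, not code from the question; the MSB-first bit order and right-padding of a short final chunk are assumptions.

```python
def pack_bits(bits):
    """Pack a sequence of 0/1 values into bytes, 8 bits per byte.

    Illustrates the byte[n] -> byte[n/8] step from the question.
    Bit order (MSB-first) and zero-padding on the right are assumptions.
    """
    out = bytearray()
    for i in range(0, len(bits), 8):
        chunk = bits[i:i + 8]
        byte = 0
        for bit in chunk:
            byte = (byte << 1) | bit
        # left-align a short final chunk so padding bits land on the right
        byte <<= 8 - len(chunk)
        out.append(byte)
    return bytes(out)
```

Note that a stream of literal 0/1 bytes wastes 7 bits per byte, so this packing alone already gives an 8:1 reduction before any general-purpose compressor is applied.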

Sergey Metlov
asked May 13 '26 22:05


1 Answer

GZipStream and DeflateStream are the standard .NET classes for such situations.

Obviously, the compression ratio you achieve depends on the binary data you are compressing. For example, if you compress a JPEG image with these algorithms, you cannot expect a very good ratio, because JPEG data is already compressed. If, on the other hand, the binary data represents text, it will compress nicely.
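GZipStream and DeflateStream are .NET classes; as a hedged analogue, the sketch below uses Python's zlib, which implements the same underlying DEFLATE algorithm, to show how the ratio varies with the data. The sample inputs (repetitive text vs. random bytes standing in for already-compressed data like JPEG) are illustrative assumptions.

```python
import os
import zlib

# Repetitive text compresses very well; high-entropy bytes (a stand-in
# for already-compressed data such as a JPEG) barely compress at all.
text = b"the quick brown fox jumps over the lazy dog " * 100
noise = os.urandom(len(text))

for label, data in (("text", text), ("noise", noise)):
    packed = zlib.compress(data, level=9)
    print(f"{label}: {len(data)} -> {len(packed)} bytes")
    assert zlib.decompress(packed) == data  # lossless round trip
```

The text shrinks to a small fraction of its size, while the random buffer comes out slightly larger than the input (DEFLATE falls back to stored blocks plus a few bytes of framing overhead).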

Darin Dimitrov
answered May 15 '26 12:05


