 

Is JPEG lossless when quality is set to 100?

I understand that JPEG is a lossy compression standard, and that the 'quality' factor controls the degree of compression and thus the amount of data loss.

But when the quality number is set to 100, is the resulting jpeg lossless?

asked Nov 02 '11 by Sugrue



People also ask

Can a JPG be lossless?

Zip files are always lossless, and some image formats are lossless (PNG and TIF LZW), but JPG is not. JPG uses lossy compression, which is what allows it to shrink file data so dramatically; the process has to take liberties with the data to accomplish that.

What quality should I use for JPEG?

As a general benchmark: 90% JPEG quality gives a very high-quality image while gaining a significant reduction on the original 100% file size. 80% JPEG quality gives a greater file size reduction with almost no loss in quality.
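
As a rough way to see that trade-off on your own images, a short Pillow sketch along these lines compares output sizes at a few quality settings. The file names photo.png and out.jpg are placeholders, not anything from the question.

```python
import os
from PIL import Image

# "photo.png" is a placeholder; use any reasonably large photo you have.
img = Image.open("photo.png").convert("RGB")

for q in (100, 90, 80):
    img.save("out.jpg", quality=q)
    size_kib = os.path.getsize("out.jpg") / 1024
    print(f"quality={q}: {size_kib:.0f} KiB")

# On typical photos the drop from 100 to 90 is large, and 90 to 80 shrinks the
# file further with little visible change, matching the rule of thumb above.
```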

Does JPEG use lossless compression?

JPEG is a lossy format that offers a higher compression rate than PNG at the cost of some quality.

What is quality factor in JPEG compression?

In toolkits that expose a QFactor parameter, the QFactor can be a value between −1 and 255. −1 and 0 represent lossless compression, while the values between 1 and 255 are compression ratios; for example, a factor of 10 is a compression ratio of 10. A factor of 1 gives the best lossy quality, while a factor of 255 gives the highest compression. (This scale is specific to such toolkits; the usual JPEG quality setting runs from 1 to 100.)


2 Answers

As the other answer here correctly says, using a "typical" JPEG encoder at quality 100 does not give you lossless compression. Lossless JPEG encoding does exist, but it is different in nature and seldom used.

I'm just posting to say why quality 100 does not mean lossless.

In JPEG compression, information is mostly lost during the DCT coefficient quantization step: each 8-by-8 block of coefficients is divided element-wise by an 8-by-8 quantization table, so the coefficients become smaller and therefore more compressible. When you set JPEG quality to 100, no real quantization takes place (because the quantization table will be all 1s, at least with the standard IJG-JPEG tables), so you don't actually lose information at this step.
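
To make that concrete, here is a minimal NumPy/SciPy sketch of the quantization step on a single 8-by-8 block. It is not a full JPEG codec (no zig-zag ordering or entropy coding, and it uses SciPy's orthonormal DCT rather than libjpeg's exact scaling); the block contents are random, and the table shown is the well-known example luminance table that encoders scale according to the quality setting.

```python
import numpy as np
from scipy.fft import dctn, idctn

# A random 8x8 block of pixel values, level-shifted by -128 as JPEG does.
rng = np.random.default_rng(1)
block = rng.integers(0, 256, size=(8, 8)).astype(float) - 128

# Example luminance quantization table (JPEG spec, Annex K); real encoders
# scale it up or down according to the requested quality.
q50_table = np.array([
    [16, 11, 10, 16, 24, 40, 51, 61],
    [12, 12, 14, 19, 26, 58, 60, 55],
    [14, 13, 16, 24, 40, 57, 69, 56],
    [14, 17, 22, 29, 51, 87, 80, 62],
    [18, 22, 37, 56, 68, 109, 103, 77],
    [24, 35, 55, 64, 81, 104, 113, 92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103, 99],
], dtype=float)
q100_table = np.ones((8, 8))  # at quality 100 the IJG scaling collapses to all 1s

coeffs = dctn(block, norm="ortho")  # forward 2-D DCT of the block

for label, table in [("quality 50", q50_table), ("quality 100", q100_table)]:
    quantized = np.round(coeffs / table)             # the lossy divide-and-round step
    restored = idctn(quantized * table, norm="ortho")
    err = np.abs(block - restored).max()
    print(f"{label}: max reconstruction error in this block = {err:.3f}")

# With the all-1s table, the only remaining error comes from rounding the DCT
# coefficients to integers, not from quantization proper.
```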

However, there are two main factors that lead to information loss even when no quantization takes place:

  1. Typically, JPEG compression reduces color information (because the human visual system is less sensitive to it than to luminance). So even at quality 100 the encoder may be carrying out chrominance subsampling, i.e. dropping half or more of the Cb and Cr samples. When this happens, information is lost even though no quantization happens. However, you can tell the encoder to preserve full chrominance (so-called 4:4:4 color sampling).
  2. In addition, JPEG encoding involves transforming to the DCT domain (and converting between color spaces), and coefficients and pixel values get rounded to integers along the way. Rounding discards some information, and it happens regardless of all other options. The sketch after this list shows both effects at quality 100.
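
Here is a minimal Pillow/NumPy sketch (not from the original answer) that saves a small synthetic image at quality 100 with and without chroma subsampling and measures how far the round trip drifts from the original. Pillow's JPEG writer exposes a subsampling option where 0 means 4:4:4 and 2 means 4:2:0; the image contents and the scratch file name roundtrip.jpg are made up for the demonstration.

```python
import numpy as np
from PIL import Image

# A small random RGB image (contents are arbitrary, just for the test).
rng = np.random.default_rng(0)
rgb = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
original = Image.fromarray(rgb, mode="RGB")

for subsampling, label in [(2, "4:2:0 chroma subsampling"), (0, "4:4:4 full chroma")]:
    # "roundtrip.jpg" is just a scratch file for this demo.
    original.save("roundtrip.jpg", quality=100, subsampling=subsampling)
    restored = np.asarray(Image.open("roundtrip.jpg"), dtype=int)
    diff = np.abs(restored - np.asarray(original, dtype=int))
    print(f"quality=100, {label}: max per-channel difference = {diff.max()}")

# Both runs typically report small non-zero differences: subsampling adds chroma
# error, and the colour-space conversion plus DCT round trip add rounding error
# even with full chroma preserved.
```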
answered Sep 28 '22 by Marco Fontani



JPEG is lossy regardless of the setting. At 100, you just get the LEAST loss possible.

It's easy enough to test. Whip up a simple .bmp, compress it to a q=100 JPEG, then decompress back to a .bmp. Use Gimp/Photoshop to do a "difference" of the two bitmaps and you'll see the lossiness: it'll be much less noticeable than with a q=50 or q=1 conversion, but it's still there.
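
If you'd rather script that test than eyeball it in Gimp/Photoshop, a small Pillow/NumPy sketch along these lines does the same comparison; test.bmp stands for whatever bitmap you whip up, and the printed numbers are just a numeric stand-in for the visual "difference" layer.

```python
import numpy as np
from PIL import Image

# "test.bmp" is a placeholder for whatever bitmap you created.
src = np.asarray(Image.open("test.bmp").convert("RGB"), dtype=int)

for q in (100, 50, 1):
    Image.fromarray(src.astype(np.uint8)).save("test_q.jpg", quality=q)
    back = np.asarray(Image.open("test_q.jpg"), dtype=int)
    diff = np.abs(src - back)
    changed = (diff > 0).mean()
    print(f"q={q}: {changed:.1%} of samples changed, max difference = {diff.max()}")

# Expect a non-zero difference even at q=100; q=50 and q=1 are simply much worse.
```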

answered Sep 28 '22 by Marc B
