Ensuring exported JPEG is less than maximum file size

I currently have an application which takes a screenshot of a presenter's desktop and then broadcasts it via a custom protocol to the viewers. For the images to be transferred quickly enough to get a frame rate of 2-3 images per second, I need to ensure the image size is always less than ~300 KB.

I'm using C# for the presenter application, which encodes the screenshot into a JPEG via the process below. My concern is that the output size can vary greatly with a static compression setting. If I have the application capturing my screen, the output images will be ~200 KB when I have Visual Studio full screen, but if I minimize everything so my desktop background is showing, they will be ~400 KB.

I could put the encoding process into a loop and continuously decrease the quality setting until the size of the byte array is less than 300 KB, but that seems like a tedious operation. Is there any other method I could use?

Thanks in advance.

// required namespaces: System.Drawing, System.Drawing.Imaging, System.IO,
// System.Windows.Forms (for Screen)

// get the screenshot (primary screen only; the commented-out loop would cover all screens)
System.Drawing.Rectangle totalSize = System.Drawing.Rectangle.Empty;
//foreach (Screen s in Screen.AllScreens)
totalSize = System.Drawing.Rectangle.Union(totalSize, Screen.PrimaryScreen.Bounds);
Bitmap screenShotBitmap = new Bitmap(totalSize.Width, totalSize.Height, System.Drawing.Imaging.PixelFormat.Format32bppRgb);
screenShotBitmap.SetResolution(96, 96);
Graphics screenShotGraphics = Graphics.FromImage(screenShotBitmap);
screenShotGraphics.CopyFromScreen(totalSize.X, totalSize.Y,
                    0, 0, totalSize.Size, CopyPixelOperation.SourceCopy);
screenShotGraphics.Dispose();

// image codec information
ImageCodecInfo imageCodecInfo = GetEncoderInfo("image/jpeg");

// encoder settings
System.Drawing.Imaging.Encoder encoderQuality = System.Drawing.Imaging.Encoder.Quality;
System.Drawing.Imaging.Encoder encoderColor = System.Drawing.Imaging.Encoder.ColorDepth;

// compression quality for the JPEG output (0-100)
Int64 quality = 40L;

// storage for the exported JPEG
byte[] screenShotByteArray;

// encoder parameters
EncoderParameter encoderQualityParameter = new EncoderParameter(encoderQuality, quality);
//EncoderParameter encoderColorParameter = new EncoderParameter(encoderColor, 8L);

// encoder parameters table
EncoderParameters encoderParameters = new EncoderParameters(1);
encoderParameters.Param[0] = encoderQualityParameter;
//encoderParameters.Param[1] = encoderColorParameter;

// encode the bitmap into a memory stream
MemoryStream screenShotMemoryStream = new MemoryStream();
screenShotBitmap.Save(screenShotMemoryStream, imageCodecInfo, encoderParameters);

// convert to a byte array; ToArray() copies only the bytes actually written,
// whereas GetBuffer() would return the whole (possibly larger) internal buffer
screenShotByteArray = screenShotMemoryStream.ToArray();

// clean up
screenShotMemoryStream.Close();
screenShotBitmap.Dispose();

asked May 11 '11 by BSchlinker

1 Answer

If you're putting things into a loop, be careful to use something similar to binary search instead of just increasing/decreasing the quality parameter by a fixed amount until the desired size is reached.

EDIT: Explaining the binary search a bit. Take the hypothetical case of a picture that compresses to quality*10000 bytes, so the optimal quality setting would be 30. The naive approach would be to try some fixed quality setting (e.g. 80, which would give 800,000 bytes) and then decrease it by a fixed amount until 300,000 bytes are reached. If you decrease the quality by 5 at each step, for example, you'd try 12 quality settings with this method before finding the right one. A binary search gets there faster, like this:

Quality    Size    Next step
80         800000  Too big, so quality := quality/2
40         400000  Too big, so quality := quality/2
20         200000  Too small, so quality := (40+20)/2
30         300000  Reached desired size

This gives the result after only 4 tries (or 3, depending on whether 200,000 bytes is too small or already fine for you). Since size doesn't have a linear relation to quality, this example is a bit unrealistic, but a binary search should still give you better results than the naive approach.
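
As a rough C# sketch of that search, reusing the GetEncoderInfo helper and encoder setup from the question (the method names EncodeJpeg and EncodeUnderLimit are just illustrative, not from any library):

// encode a bitmap to JPEG at a given quality and return the bytes
static byte[] EncodeJpeg(Bitmap bitmap, long quality)
{
    ImageCodecInfo jpegCodec = GetEncoderInfo("image/jpeg");
    using (EncoderParameters parameters = new EncoderParameters(1))
    using (MemoryStream stream = new MemoryStream())
    {
        parameters.Param[0] = new EncoderParameter(System.Drawing.Imaging.Encoder.Quality, quality);
        bitmap.Save(stream, jpegCodec, parameters);
        return stream.ToArray();
    }
}

// binary search over the quality range: each iteration halves the interval,
// so at most ~7 encodes are needed to find the highest quality that fits
static byte[] EncodeUnderLimit(Bitmap bitmap, int maxBytes)
{
    int low = 1, high = 100;
    byte[] best = null;

    while (low <= high)
    {
        int mid = (low + high) / 2;
        byte[] candidate = EncodeJpeg(bitmap, mid);

        if (candidate.Length <= maxBytes)
        {
            best = candidate;   // fits: keep it and try a higher quality
            low = mid + 1;
        }
        else
        {
            high = mid - 1;     // too big: lower the quality
        }
    }
    return best;                // null if even quality 1 exceeds maxBytes
}

Calling something like EncodeUnderLimit(screenShotBitmap, 300 * 1024) before sending each frame bounds the per-frame cost at roughly seven encodes in the worst case.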

You could also use some typical images for "training". Encode them using different quality settings (e.g. 100, 90, ..., 20, 10) and see how big they get relative to their original size. This might give a good first estimate in most cases, although you will still have to adjust when encountering images with much more or less detail in them.
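
One way to build that estimate, again just a sketch (the sample directory path is a placeholder, and EncodeJpeg is the helper sketched above):

// encode a set of representative screenshots at several quality settings and
// record the resulting bytes per pixel, to use as a first guess at runtime
var qualities = new long[] { 100, 90, 80, 70, 60, 50, 40, 30, 20, 10 };
var bytesPerPixel = new Dictionary<long, List<double>>();

foreach (string path in Directory.GetFiles(@"C:\samples", "*.png"))  // placeholder path
{
    using (var sample = new Bitmap(path))
    {
        long pixels = (long)sample.Width * sample.Height;
        foreach (long q in qualities)
        {
            byte[] encoded = EncodeJpeg(sample, q);
            if (!bytesPerPixel.ContainsKey(q))
                bytesPerPixel[q] = new List<double>();
            bytesPerPixel[q].Add((double)encoded.Length / pixels);
        }
    }
}

// averaging each list gives an estimated bytes-per-pixel figure per quality;
// pick the highest quality whose estimate keeps a frame under ~300 KB and
// fall back to the search above whenever an actual frame overshoots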

Alternatively, have a look at JPEG 2000 encoders; those have the option to set a target file size instead of a quality level.

EDIT: I don't know of any JPEG 2000 encoding libraries for C#; there only seem to be decoders floating around, so this could get more complicated than I thought at first. You might give CSJ2K a try, but the description doesn't sound like it's ready to use.

answered Oct 18 '22 by schnaader