Bilinear interpolation - DirectX vs. GDI+

I have a C# app for which I've written GDI+ code that uses Bitmap/TextureBrush rendering to present 2D images, which can have various image processing functions applied. This code is a new path in an application that mimics existing DX9 code, and they share a common library to perform all vector and matrix (e.g. ViewToWorld/WorldToView) operations. My test bed consists of DX9 output images that I compare against the output of the new GDI+ code.

A simple test case that renders to a viewport matching the Bitmap dimensions (i.e. no zoom or pan) matches pixel-perfectly (no binary diff) - but as soon as the image is zoomed up (magnified), I get very minor differences in 5-10% of the pixels. The magnitude of the difference is 1/256 (occasionally 2/256). I suspect this is due to interpolation differences.
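
For reference, the comparison itself is conceptually along these lines (a throwaway sketch, not my actual test bed; GetPixel is slow, but it keeps the example short):

// Tally per-pixel differences between two same-sized bitmaps (grayscale, so R == G == B).
using System;
using System.Drawing;

static class BitmapDiff
{
    public static void Report(Bitmap a, Bitmap b)
    {
        int maxDelta = 0, differing = 0, total = a.Width * a.Height;
        for (int y = 0; y < a.Height; y++)
        {
            for (int x = 0; x < a.Width; x++)
            {
                int delta = Math.Abs(a.GetPixel(x, y).R - b.GetPixel(x, y).R);
                if (delta > 0) differing++;
                if (delta > maxDelta) maxDelta = delta;
            }
        }
        Console.WriteLine($"{differing * 100.0 / total:F1}% of pixels differ, max delta = {maxDelta}/255");
    }
}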

Question: For a DX9 ortho projection (and identity world space), with a camera perpendicular and centered on a textured quad, is it reasonable to expect DirectX.Direct3D.TextureFilter.Linear to generate identical output to a GDI+ TextureBrush filled rectangle/polygon when using the System.Drawing.Drawing2D.InterpolationMode.Bilinear setting?

For this (magnification) case, the DX9 code is using this (MinFilter, MipFilter set similarly):
Device.SetSamplerState(0, SamplerStageStates.MagFilter, (int)TextureFilter.Linear);

and the GDI+ path is using: g.InterpolationMode = InterpolationMode.Bilinear;
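
For context, the fill is roughly along these lines (a simplified sketch, not my actual code; the PixelOffsetMode line is only there to show another setting that influences where samples land):

// Simplified GDI+ rendering path: fill a zoomed destination rectangle from a source Bitmap.
using System.Drawing;
using System.Drawing.Drawing2D;

static class GdiPlusPath
{
    public static void RenderZoomed(Graphics g, Bitmap source, Rectangle destRect, Matrix imageToView)
    {
        g.InterpolationMode = InterpolationMode.Bilinear;
        g.PixelOffsetMode = PixelOffsetMode.Half;   // shifts sample positions by half a pixel

        using (var brush = new TextureBrush(source, WrapMode.Clamp))
        {
            brush.Transform = imageToView;          // image/texture space -> view space
            g.FillRectangle(brush, destRect);
        }
    }
}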

I thought that "Bilinear Interpolation" was a fairly specific filter definition, but then I noticed that GDI+ has another option, "HighQualityBilinear" (which I've tried, with no difference - which makes sense given its description of "added prefiltering for shrinking").

Followup Question: Is it reasonable to expect pixel-perfect output matching between DirectX and GDI+ (assuming all external coordinates passed in are equal)? If not, why not?

Clarification: The images I'm using are opaque grayscale (R=G=B, A=1) using Format32bppPArgb.

Finally, there are a number of other APIs I could be using (Direct2D, WPF, GDI, etc.) - and this question generally applies to comparing the output of "equivalent" bilinear interpolated output images across any two of these. Thanks!

asked Mar 11 '11 by holtavolt

2 Answers

DirectX runs mostly on the GPU, and DX9 may be running shaders. GDI+ runs on completely different algorithms. I don't think it is reasonable to expect the two to come up with exactly pixel-matching output.

I'd expect DX9 to have better quality than GDI+, which is a step up from the old GDI but not by much. GDI+ has long been known to have trouble with anti-aliasing lines and with preserving quality when scaling images (which seems to be your problem). To get something comparable in quality to latest-generation GPU texture processing, you'll need to move to WPF graphics, which gives quality similar to DX.

WPF also uses the GPU (if available) and falls back to software rendering (if no GPU is available), so the output of the GPU and software paths is reasonably close.
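
As a minimal sketch (the helper name is illustrative, not required WPF boilerplate), this is how high-quality scaling can be requested on a WPF Image element; whether the GPU is actually used is decided by WPF's render tier, not by this setting:

// Minimal WPF sketch: request high-quality bitmap scaling on an Image element.
using System.Windows.Controls;
using System.Windows.Media;
using System.Windows.Media.Imaging;

public static class WpfScaling
{
    public static Image MakeScaledImage(BitmapSource source)
    {
        var image = new Image { Source = source };
        // HighQuality maps to WPF's Fant scaling mode; GPU vs. software is chosen by the render tier.
        RenderOptions.SetBitmapScalingMode(image, BitmapScalingMode.HighQuality);
        return image;
    }
}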

EDIT: Although this has been picked as the answer, it is only an early attempt at an explanation and doesn't really address the true reason. The reader is referred instead to the discussions in the comments on the question and to the other answers.

answered by Stephen Chung


Why do you make the assumption that they use the same formula?

Even if they do use the same formula, and you accept that the implementations are different, would you expect the output to be the same?

At the end of the day, the code is designed to work with perception, not to be mathematically precise - although you can get that with CUDA if you want.

Rather than being surprised that you get different results, I would be very surprised if you got pixel-perfect matches.

The way they represent colour is different as well... I know for a fact that NVIDIA uses a float (maybe a double) to represent colour, whereas GDI uses an int, I believe.

http://en.wikipedia.org/wiki/GPGPU

Shader Model 2.0 appeared with DX9, which is when the colour implementation switched from ints to 24- and 32-bit floats.

Try comparing ATI/AMD rendering to NVIDIA rendering and you can clearly see that colour comes out very differently. I first noticed this in Quake 2... the difference between the two cards was staggering - of course that is due to a great many things, not least of which is their bilinear interpolation implementation.

EDIT: the info about how the specification was made appeared after I answered. Anyway, I think the data types used to store it will be different no matter how you specify it. Moreover, the implementation of float is likely to be different. I may be wrong, but I'm pretty sure C# implements float differently from the C compiler that NVIDIA uses. (And that assumes GDI+ doesn't just convert the float into the equivalent int...)

Even if I am wrong about that, I would generally hold it to be exceptional for two different implementations of an algorithm to be identical. They are optimised for speed, and as a result the differences in optimisation will translate directly into differences in image quality, since that speed comes from different approaches to cutting corners/approximation.
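
To give a flavour of the representation point, here is a contrived sketch (nothing to do with either library's real code): the same 1-D linear interpolation computed with float weights and with 8-bit fixed-point weights already disagrees by 1/256 for some inputs.

// Compare float-weight vs. 8-bit fixed-point-weight linear interpolation between two gray values.
using System;

static class InterpDemo
{
    static byte LerpFloat(byte a, byte b, double t) =>
        (byte)Math.Round(a + (b - a) * t);

    static byte LerpFixed(byte a, byte b, int w256) =>            // weight quantised to [0, 256]
        (byte)((a * (256 - w256) + b * w256 + 128) >> 8);

    static void Main()
    {
        for (int i = 0; i <= 255; i++)
        {
            double t = i / 255.0;
            byte f = LerpFloat(10, 200, t);
            byte q = LerpFixed(10, 200, (int)Math.Round(t * 256));
            if (f != q)
                Console.WriteLine($"t={t:F4}: float={f}, fixed-point={q}");
        }
    }
}

Extend that to 2-D bilinear weights and per-channel rounding, and a 1-2/256 discrepancy on a few percent of pixels is about what you would expect.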

answered by John Nicholas