When I draw a stretched Texture2D, the pixels get a blur-like effect.
I want to use 'pixelated' graphics in my game and would like to know how to disable this in favor of the simplest nearest neighbor zoom.
I've created this picture for illustration: (x4 zoom)
How can I accomplish this?
The effect you want is known as nearest-neighbor interpolation. As the name suggests, each destination pixel simply replicates the nearest source pixel instead of blending between neighbors, so zooming just repeats pixels and the image stays blocky rather than blurring.
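For illustration only, here is a rough CPU-side sketch of that idea in C# (ScaleNearest and its parameters are made up for this example; in practice you let the GPU do this through the sampler state, as described below):

using Microsoft.Xna.Framework;

static Color[] ScaleNearest(Color[] src, int srcWidth, int srcHeight, int factor)
{
    // Each destination pixel copies the single nearest source pixel;
    // nothing is averaged, so edges stay hard instead of blurring.
    int dstWidth = srcWidth * factor;
    var dst = new Color[dstWidth * srcHeight * factor];
    for (int y = 0; y < srcHeight * factor; y++)
        for (int x = 0; x < dstWidth; x++)
            dst[y * dstWidth + x] = src[(y / factor) * srcWidth + (x / factor)];
    return dst;
}

You could push the result back into a texture with Texture2D.SetData, but the sampler-state answers below get the same look with no extra memory or CPU work.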
In XNA 4, change your SpriteBatch.Begin() to set the sampler state to SamplerState.PointClamp
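For example, one of the Begin() overloads takes the sampler state directly, so the whole batch is drawn with point filtering (spriteBatch, sprite and destinationRectangle stand in for your own fields here):

// Passing PointClamp to Begin() means SpriteBatch applies it for you,
// even in the default Deferred mode.
spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.AlphaBlend,
                  SamplerState.PointClamp, DepthStencilState.None,
                  RasterizerState.CullCounterClockwise);
spriteBatch.Draw(sprite, destinationRectangle, Color.White);
spriteBatch.End();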
If you're using a shader to draw the image, you can modify the sampler state:
sampler2D mySampler = sampler_state
{
    Texture = <SomeTexture>;
    MinFilter = Point;
    MagFilter = Point;
    MipFilter = Point;
};
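If you go that route, the compiled effect can then be handed to SpriteBatch.Begin() (myEffect is assumed to be the Effect you loaded through the content pipeline):

// The custom effect replaces SpriteBatch's built-in shader for these draws,
// so its point-filtered sampler is what gets used.
spriteBatch.Begin(SpriteSortMode.Immediate, BlendState.AlphaBlend,
                  null, null, null, myEffect);
spriteBatch.Draw(sprite, destinationRectangle, Color.White);
spriteBatch.End();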
Point sampling prevents the GPU from interpolating between texels when it samples the texture, and that interpolation is what's causing your antialiased / blurred result.
If you're just using SpriteBatch to draw the image, you can set the filter with:
GraphicsDevice.SamplerStates[0] = SamplerState.PointClamp;
Also, you may have to set the SpriteBatch to Immediate mode: in Deferred mode the batch applies its own sampler state when it flushes, so a state you set after Begin() can be overwritten (alternatively, pass PointClamp to Begin() as shown above). See this article on MSDN or this thread on the App Hub forums for more information.
Here's an earlier SO thread that'll probably be helpful for more background.
Textures are sampled using normalized coordinates (0..1, 0..1) rather than texel coordinates. With the default linear filter, the GPU finds the four texels closest to a given texture coordinate and interpolates between them based on where the sample point falls within that square.
So, if I have a texture that is 10 x 10 pixels and I sample at, say, (0.20, 0.20), that maps to texel coordinate (2.0, 2.0), which is the exact corner shared by the texels at (1,1), (2,1), (1,2) and (2,2); the resulting pixel on screen is simply the average of those four texels. Sample at (0.24, 0.24) instead and the result is heavily biased towards the texel at (2,2).
Point sampling makes the GPU always return the exact color of the single nearest texel instead of blending the texels around the sample point.
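To make that weighting concrete, here is a small stand-alone sketch of the bilinear math described above (plain C#, no XNA types; PrintBilinearWeights is a made-up helper and the numbers are just the ones from the example):

using System;

// Bilinear weights for sampling a size x size texture at normalized (u, v),
// assuming texel centers sit at (i + 0.5) / size.
static void PrintBilinearWeights(float u, float v, int size)
{
    float tx = u * size - 0.5f;          // continuous texel-space coordinate
    float ty = v * size - 0.5f;
    int x0 = (int)Math.Floor(tx), y0 = (int)Math.Floor(ty);
    float fx = tx - x0, fy = ty - y0;    // fractional position between texel centers

    Console.WriteLine($"({x0},{y0}) weight {(1 - fx) * (1 - fy):0.00}");
    Console.WriteLine($"({x0 + 1},{y0}) weight {fx * (1 - fy):0.00}");
    Console.WriteLine($"({x0},{y0 + 1}) weight {(1 - fx) * fy:0.00}");
    Console.WriteLine($"({x0 + 1},{y0 + 1}) weight {fx * fy:0.00}");
}

// PrintBilinearWeights(0.20f, 0.20f, 10) -> each of the four texels gets weight 0.25.
// PrintBilinearWeights(0.24f, 0.24f, 10) -> texel (2,2) gets weight 0.81.
// Point sampling skips the weighting and just returns the nearest texel, (2,2) in both cases.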
Here's a working .Draw() method that illustrates all of this:
protected override void Draw(GameTime gameTime)
{
    GraphicsDevice.Clear(Color.Black);

    var largeRect = new Rectangle(50, 50, sprite.Width * 3, sprite.Height * 3);

    // start the batch in Immediate mode so the sampler state set below isn't overwritten
    spriteBatch.Begin(SpriteSortMode.Immediate, BlendState.Opaque);

    // set the filter to Point
    GraphicsDevice.SamplerStates[0] = SamplerState.PointClamp;

    // draw the sprite
    spriteBatch.Draw(sprite, largeRect, Color.White);

    // done!
    spriteBatch.End();

    base.Draw(gameTime);
}