
Is it possible to do old school 2d blitting on modern GPU?

It looks like OpenGL has become mainstream on all gaming platforms (even handheld ones!). This has pushed the deployment of modern GPU chipsets to large numbers of consumers.

This is amazing.

With the modern GPU systems out there now, is it possible to do generic old-school graphics programming (i.e., blit from rect X to rect Y in VRAM; think Amiga)? Or are the operations centered entirely around vertex and pixel shaders?

Is this accessible from GL? OpenGL ES?

Rendering a textured quad is OK, but it would require double buffering and re-rendering the entire scene. I was hoping to avoid that.

asked Feb 28 '10 by drudru

3 Answers

Check out the glBlitFramebuffer routine (part of the Framebuffer Object functionality). You will need an up-to-date driver.

Keep in mind you can still use the default framebuffer, but I think it will be more fun using framebuffer objects.

Keep your sprites in separate framebuffers (maybe rendered using OpenGL), bind one as the read framebuffer and the window as the draw framebuffer (glBindFramebuffer with GL_READ_FRAMEBUFFER and GL_DRAW_FRAMEBUFFER), and blit between them with glBlitFramebuffer. It's quite simple and fast.
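
A minimal sketch of what the blit itself might look like, assuming the sprite already lives in the color attachment of an FBO (spriteFbo, spriteW, spriteH, x and y are placeholder names, not anything from the question):

    /* Copy a sprite stored in an FBO straight into the window's default
     * framebuffer. Needs OpenGL 3.0+ or GL_EXT_framebuffer_blit. */
    glBindFramebuffer(GL_READ_FRAMEBUFFER, spriteFbo); /* blit source         */
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);         /* default FB as dest  */

    /* src rect (0,0)-(spriteW,spriteH) -> dst rect (x,y)-(x+spriteW,y+spriteH) */
    glBlitFramebuffer(0, 0, spriteW, spriteH,
                      x, y, x + spriteW, y + spriteH,
                      GL_COLOR_BUFFER_BIT, GL_NEAREST);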

answered Nov 07 '22 by Luca

Well, you can use libSDL and get a pointer to the screen framebuffer and do whatever you want with the pixels. Or you can do all your drawing to a memory buffer, load it into a GL texture, and draw textured quads, which is probably faster because of hardware acceleration.
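
For the first route, a rough sketch using classic SDL 1.2 (error handling omitted; the 640x480, 32-bit software surface is just an assumption for the example):

    #include <SDL/SDL.h>
    #include <stdint.h>

    int main(void)
    {
        SDL_Init(SDL_INIT_VIDEO);
        SDL_Surface *screen = SDL_SetVideoMode(640, 480, 32, SDL_SWSURFACE);

        if (SDL_MUSTLOCK(screen)) SDL_LockSurface(screen);

        /* Poke pixels directly, old-school style. */
        uint32_t *pixels = (uint32_t *)screen->pixels;
        for (int y = 0; y < 480; ++y)
            for (int x = 0; x < 640; ++x)
                pixels[y * (screen->pitch / 4) + x] =
                    SDL_MapRGB(screen->format, x & 0xFF, y & 0xFF, 0);

        if (SDL_MUSTLOCK(screen)) SDL_UnlockSurface(screen);
        SDL_Flip(screen);   /* present the frame */

        SDL_Delay(2000);
        SDL_Quit();
        return 0;
    }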

answered Nov 07 '22 by Daniele Santi


It may be possible on some embedded systems to get a framebuffer pointer and write to it directly, but these days you're better off using OpenGL|ES and rendering a texture. It will be more portable, and probably faster.

You could create a buffer in main memory, do all the bit twiddling you want, and then render it as a texture. The texture data can be DMA'd to VRAM for speed, and drawing it on a quad is equivalent to a blit, but it doesn't cost any CPU cycles and runs as fast as the GPU can process it.
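
A sketch of that per-frame upload, using old fixed-function desktop GL for brevity (tex, pixels, WIDTH and HEIGHT are assumed to be set up elsewhere; on OpenGL ES you would draw the quad with a vertex buffer and a trivial shader instead of glBegin/glEnd):

    /* Upload the CPU-side "VRAM" into an existing texture... */
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, WIDTH, HEIGHT,
                    GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    /* ...and draw it as a single quad covering the viewport. */
    glEnable(GL_TEXTURE_2D);
    glBegin(GL_QUADS);
        glTexCoord2f(0, 0); glVertex2f(-1, -1);
        glTexCoord2f(1, 0); glVertex2f( 1, -1);
        glTexCoord2f(1, 1); glVertex2f( 1,  1);
        glTexCoord2f(0, 1); glVertex2f(-1,  1);
    glEnd();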

It's amazing what you can do with shaders and programmable pipelines these days.

answered Nov 07 '22 by gavinb