
GLSL Renderbuffer really required?

Tags:

glsl

I am trying to write a program that renders video camera frames onto a quad. I have seen tutorials explaining that this can be faster with framebuffers, but I am still learning how to use them. Then, besides framebuffers, I found that there are also renderbuffers.

The question is: if the purpose is only to render a texture onto a quad that fills the screen, do I really need a renderbuffer?

I understand that renderbuffers are used for depth testing, which I think only checks the Z position of each pixel, so it seems it would be silly to have to create a renderbuffer for my scenario. Is that correct?

PerracoLabs asked Mar 24 '12


People also ask

When should I use Renderbuffer?

Renderbuffer Objects are OpenGL Objects that contain images. They are created and used specifically with Framebuffer Objects. They are optimized for use as render targets, while Textures may not be, and are the logical choice when you do not need to sample (i.e. in a post-pass shader) from the produced image.

What is a use of framebuffer?

A framebuffer (frame buffer, or sometimes framestore) is a portion of random-access memory (RAM) containing a bitmap that drives a video display. It is a memory buffer containing data representing all the pixels in a complete video frame. Modern video cards contain framebuffer circuitry in their cores.

What is framebuffer in OpenGL?

A Framebuffer is a collection of buffers that can be used as the destination for rendering. OpenGL has two kinds of framebuffers: the Default Framebuffer, which is provided by the OpenGL Context; and user-created framebuffers called Framebuffer Objects (FBOs).

What is a buffer in rendering?

The term "frame buffer" traditionally refers to the region of memory that holds the color data for the image displayed on a computer screen. In WebGL, a framebuffer is a data structure that organizes the memory resources that are needed to render an image.


1 Answer

A framebuffer object is a place to stick images so that you can render to them. Color buffers, depth buffers, etc all go into a framebuffer object.

A renderbuffer is like a texture, but with two important differences:

  1. It is always 2D and has no mipmaps. So it's always exactly 1 image.
  2. You cannot read from a renderbuffer. You can attach them to an FBO and render to them, but you can't sample from them with a texture access or something.

So you're talking about two mostly separate concepts. Renderbuffers do not have to be "for depth testing." That is a common use case for renderbuffers, because if you're rendering the colors to a texture, you usually don't care about the depth. You need a depth buffer because you need depth testing for hidden-surface removal. But you don't need to sample from that depth buffer. So instead of making a depth texture, you make a depth renderbuffer.
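That pattern can be sketched roughly as follows (illustrative names; assumes a current OpenGL 3.x context with a function loader already initialized, and `width`/`height` defined): a color texture you can sample later, plus a depth renderbuffer you only ever test against.

```c
/* Sketch: FBO with a sampleable color texture and a depth renderbuffer. */
GLuint fbo, colorTex, depthRbo;

/* Color attachment as a texture, so a later pass can sample it. */
glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

/* Depth attachment as a renderbuffer: used for depth testing only,
 * never sampled, so there is no need for a depth texture. */
glGenRenderbuffers(1, &depthRbo);
glBindRenderbuffer(GL_RENDERBUFFER, depthRbo);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, width, height);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, colorTex, 0);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                          GL_RENDERBUFFER, depthRbo);

if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    /* handle incomplete framebuffer */
}
```

These calls need a live GL context, so treat this as a setup fragment rather than a standalone program.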

But renderbuffers can also use color formats rather than depth formats. You just can't attach them as textures. You can still blit from/to them, and you can still read them back with glReadPixels. You just can't read from them in a shader.
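A minimal sketch of that case (same assumptions as above: a current GL context and `width`/`height` defined) would be a color renderbuffer that is rendered to and then read back, never sampled:

```c
/* Sketch: FBO with a color renderbuffer, read back via glReadPixels. */
GLuint fbo, colorRbo;

glGenRenderbuffers(1, &colorRbo);
glBindRenderbuffer(GL_RENDERBUFFER, colorRbo);
glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA8, width, height);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                          GL_RENDERBUFFER, colorRbo);

/* ... render the scene into the FBO ... */

/* The renderbuffer cannot be sampled in a shader, but its contents
 * can be copied back to CPU memory: */
unsigned char *pixels = malloc((size_t)width * height * 4);
glReadBuffer(GL_COLOR_ATTACHMENT0);
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
```

Again, this is a fragment that assumes a bound GL context, not a complete program.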

Oddly enough, this does nothing to answer your question:

The question is, if the purpose is only to write a texture into a quad that will fill up the screen, do I really need a renderbuffer?

I don't see why you need a framebuffer or a renderbuffer of any kind. A texture is a texture; just draw a textured quad.
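For the camera-frame scenario in the question, the per-frame work reduces to roughly this (hypothetical `cameraTex`/`framePixels` names; assumes the texture was already allocated with glTexImage2D and a shader program drawing a fullscreen quad is bound):

```c
/* Sketch: upload the latest camera frame and draw it on a quad.
 * No FBO or renderbuffer involved. */
glBindTexture(GL_TEXTURE_2D, cameraTex);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, frameWidth, frameHeight,
                GL_RGB, GL_UNSIGNED_BYTE, framePixels);

/* A fullscreen quad has nothing behind it, so depth testing is moot. */
glDisable(GL_DEPTH_TEST);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
```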

Nicol Bolas answered Oct 07 '22