Convert OpenGL ES 2.0 rendered texture to bitmap and back

I'd like to blur a rendered texture with RenderScript. To do that I need to convert the texture to a Bitmap, and to use the blurred result I need to convert it back into an OpenGL texture.

The render-to-texture part is working. The problem has to be somewhere in the code below, but I don't understand why it doesn't work; I'm getting a black screen.

    public void renderToTexture(){
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fb[0]);

        // specify texture as color attachment
        GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D, renderTex[0], 0);
        // attach render buffer as depth buffer
        GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_DEPTH_ATTACHMENT, GLES20.GL_RENDERBUFFER, depthRb[0]);
        // check status and bail out if the FBO is unusable
        int status = GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER);
        if (status != GLES20.GL_FRAMEBUFFER_COMPLETE) {
            Log.e("renderToTexture", "Framebuffer incomplete: " + status);
            return;
        }

        // clear only after the attachments are in place, then render into the texture
        GLES20.glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
        GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
        drawRender();

        Bitmap bitmap = SavePixels(0, 0, texW, texH);
        // blur bitmap and get back a blurredBitmap - not yet implemented,
        // see the RenderScript sketch after this block
        texture = TextureHelper.loadTexture(blurredBitmap, 128);

        // switch back to the default framebuffer and draw the blurred texture
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        GLES20.glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
        GLES20.glClear(GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);

        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texture);

        drawRender2();
    }
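
For the blur step itself (the "not yet implemented" part above), a minimal sketch using RenderScript's ScriptIntrinsicBlur could look like the following. The method name blurBitmap, the Context parameter and the radius value are my assumptions, not something from the original code:

    // Sketch of the missing blur step using android.renderscript.* (API 17+).
    // Requires: RenderScript, Allocation, Element, ScriptIntrinsicBlur imports.
    public static Bitmap blurBitmap(Context context, Bitmap src, float radius) {
        RenderScript rs = RenderScript.create(context);
        Allocation input = Allocation.createFromBitmap(rs, src);
        Allocation output = Allocation.createTyped(rs, input.getType());
        ScriptIntrinsicBlur blur = ScriptIntrinsicBlur.create(rs, Element.U8_4(rs));
        blur.setRadius(radius); // valid range is (0, 25]
        blur.setInput(input);
        blur.forEach(output);
        Bitmap result = Bitmap.createBitmap(src.getWidth(), src.getHeight(), src.getConfig());
        output.copyTo(result);
        rs.destroy();
        return result;
    }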

To create the bitmap I read the pixels back from the framebuffer, because I didn't find any other way to do it, but I'm open to other methods.

    public static Bitmap SavePixels(int x, int y, int w, int h)
    {
        int b[] = new int[w * h];
        int bt[] = new int[w * h];
        IntBuffer ib = IntBuffer.wrap(b);
        ib.position(0);
        // Read RGBA, not RGB: with GL_UNSIGNED_BYTE each pixel must fill a whole
        // int in the buffer, and GL_RGB would only deliver 3 bytes per pixel.
        GLES20.glReadPixels(x, y, w, h, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, ib);

        // glReadPixels returns rows bottom-up, and on a little-endian device each
        // int holds ABGR; flip the rows and swap R and B to get ARGB for the Bitmap.
        for (int i = 0; i < h; i++)
        {
            for (int j = 0; j < w; j++)
            {
                int pix = b[i * w + j];
                int pb = (pix >> 16) & 0xff;
                int pr = (pix << 16) & 0x00ff0000;
                int pix1 = (pix & 0xff00ff00) | pr | pb;
                bt[(h - i - 1) * w + j] = pix1;
            }
        }

        return Bitmap.createBitmap(bt, w, h, Bitmap.Config.ARGB_8888);
    }
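
An alternative that avoids the manual R/B swap entirely is to read into a direct ByteBuffer and use Bitmap#copyPixelsFromBuffer, which expects RGBA-ordered bytes for ARGB_8888 bitmaps (the same trick the answer below relies on). A sketch, with savePixelsFast being a hypothetical name:

    // Requires: import java.nio.ByteBuffer; import java.nio.ByteOrder;
    public static Bitmap savePixelsFast(int x, int y, int w, int h) {
        ByteBuffer buf = ByteBuffer.allocateDirect(w * h * 4);
        buf.order(ByteOrder.nativeOrder());
        GLES20.glReadPixels(x, y, w, h, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, buf);
        buf.rewind();
        Bitmap bmp = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
        bmp.copyPixelsFromBuffer(buf); // RGBA bytes map directly onto ARGB_8888
        // Note: the rows are still bottom-up relative to GL conventions;
        // flip the image when drawing it, or invert it during rendering.
        return bmp;
    }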

Here is the bitmap-to-texture code:

    public static int loadTexture(final Bitmap pics, int size)
{
    final int[] textureHandle = new int[1];

    GLES20.glGenTextures(1, textureHandle, 0);

    if (textureHandle[0] != 0)
    {
        // Read in the resource
        final Bitmap bitmap = pics;

        GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
        GLES20.glEnable(GLES20.GL_BLEND);

        // Bind to the texture in OpenGL
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);

        // Set filtering
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);

        // Load the bitmap into the bound texture.
        GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);

        // Recycle the bitmap, since its data has been loaded into OpenGL.
        bitmap.recycle();
    }

    if (textureHandle[0] == 0)
    {
        throw new RuntimeException("Error loading texture.");
    }

    return textureHandle[0];
}
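
One thing to watch out for: loadTexture generates a new texture object on every call, so calling it once per frame leaks textures. If the bitmap dimensions stay constant, the texture can be created once and then updated in place with GLUtils.texSubImage2D. A sketch, where updateTexture is a hypothetical helper, not part of the original code:

    // Overwrite the pixels of an already-created texture instead of
    // generating a new GL texture object each frame.
    public static void updateTexture(int textureHandle, Bitmap bitmap) {
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle);
        // Replaces level 0 of the bound texture; size/config must match.
        GLUtils.texSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, bitmap);
    }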
asked Aug 17 '14 by FaNaT



1 Answer

Have a look at the Android MediaCodec samples, in particular ExtractMpegFramesTest_egl14.java; the relevant code snippet is here:

    /**
     * Saves the current frame to disk as a PNG image.
     */
    public void saveFrame(String filename) throws IOException {
        // glReadPixels gives us a ByteBuffer filled with what is essentially big-endian RGBA
        // data (i.e. a byte of red, followed by a byte of green...).  To use the Bitmap
        // constructor that takes an int[] array with pixel data, we need an int[] filled
        // with little-endian ARGB data.
        //
        // If we implement this as a series of buf.get() calls, we can spend 2.5 seconds just
        // copying data around for a 720p frame.  It's better to do a bulk get() and then
        // rearrange the data in memory.  (For comparison, the PNG compress takes about 500ms
        // for a trivial frame.)
        //
        // So... we set the ByteBuffer to little-endian, which should turn the bulk IntBuffer
        // get() into a straight memcpy on most Android devices.  Our ints will hold ABGR data.
        // Swapping B and R gives us ARGB.  We need about 30ms for the bulk get(), and another
        // 270ms for the color swap.
        //
        // We can avoid the costly B/R swap here if we do it in the fragment shader (see
        // http://stackoverflow.com/questions/21634450/ ).
        //
        // Having said all that... it turns out that the Bitmap#copyPixelsFromBuffer()
        // method wants RGBA pixels, not ARGB, so if we create an empty bitmap and then
        // copy pixel data in we can avoid the swap issue entirely, and just copy straight
        // into the Bitmap from the ByteBuffer.
        //
        // Making this even more interesting is the upside-down nature of GL, which means
        // our output will look upside-down relative to what appears on screen if the
        // typical GL conventions are used.  (For ExtractMpegFrameTest, we avoid the issue
        // by inverting the frame when we render it.)
        //
        // Allocating large buffers is expensive, so we really want mPixelBuf to be
        // allocated ahead of time if possible.  We still get some allocations from the
        // Bitmap / PNG creation.

        mPixelBuf.rewind();
        GLES20.glReadPixels(0, 0, mWidth, mHeight, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE,
            mPixelBuf);

        BufferedOutputStream bos = null;
        try {
            bos = new BufferedOutputStream(new FileOutputStream(filename));
            Bitmap bmp = Bitmap.createBitmap(mWidth, mHeight, Bitmap.Config.ARGB_8888);
            mPixelBuf.rewind();
            bmp.copyPixelsFromBuffer(mPixelBuf);
            bmp.compress(Bitmap.CompressFormat.PNG, 90, bos);
            bmp.recycle();
        } finally {
            if (bos != null) bos.close();
        }
        if (VERBOSE) {
            Log.d(TAG, "Saved " + mWidth + "x" + mHeight + " frame as '" + filename + "'");
        }
    }
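
Here mPixelBuf is assumed to be a direct ByteBuffer allocated once up front; in ExtractMpegFramesTest it is set up roughly like this:

    mPixelBuf = ByteBuffer.allocateDirect(mWidth * mHeight * 4);
    mPixelBuf.order(ByteOrder.LITTLE_ENDIAN);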
answered Sep 16 '22 by Iven Zhang