
OpenGL drawing on Android combined with Unity: transferring a texture through a frame buffer does not work

I'm currently making an Android player plugin for Unity. The basic idea is to play the video with MediaPlayer on Android, which provides a setSurface API; the Surface passed to it is constructed from a SurfaceTexture, which in turn is bound to an OpenGL ES texture. In most other cases, such as showing an image, we can simply send this texture to Unity in the form of a pointer/id, call Texture2D.CreateExternalTexture there to generate a Texture2D object, and set that on a UI GameObject to render the picture. However, displaying video frames is a little different, since video playback on Android requires a texture of type GL_TEXTURE_EXTERNAL_OES, while Unity only supports the universal type GL_TEXTURE_2D.

To solve the problem, I googled for a while and learned that I should adopt a technique called "render to texture". To be more precise, I should generate two textures: one for MediaPlayer and SurfaceTexture on Android to receive video frames, and another for Unity, which should end up holding the same picture data. The first one should be of type GL_TEXTURE_EXTERNAL_OES (let's call it the OES texture for short) and the second of type GL_TEXTURE_2D (the 2D texture). Both generated textures are empty in the beginning. When bound to MediaPlayer, the OES texture is updated during video playback; we can then use a FrameBuffer to draw the content of the OES texture onto the 2D texture.

I've written a pure-Android version of this process and it works pretty well when I finally draw the 2D texture on the screen. However, when I publish it as a Unity Android plugin and run the same code in Unity, no picture is shown. Instead, it only displays the preset color from glClearColor, which tells me two things:

  1. The transfer process of OES texture -> FrameBuffer -> 2D texture is complete and Unity does receive the final 2D texture, because glClearColor is called only when we draw the content of the OES texture into the FrameBuffer.
  2. Something goes wrong in the drawing that happens after glClearColor, because we don't see the video frames. In fact, I also call glReadPixels after drawing and before unbinding the FrameBuffer, which reads data back from the FrameBuffer we bound. It returns a single color value, the same as the one we set in glClearColor.
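The read-back check can be sketched in plain Java. A stand-in buffer takes the place of the real glReadPixels output here (0, 0, 230, 255 is just a made-up value standing in for the cleared color), but the 4-bytes-per-pixel RGBA layout and the decoding are the same:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class Main {
  // Decode the RGBA pixel at (x, y) from a buffer laid out the way
  // glReadPixels(..., GL_RGBA, GL_UNSIGNED_BYTE, buffer) fills it.
  static int[] pixelAt(ByteBuffer pixels, int width, int x, int y) {
    int offset = (y * width + x) * 4; // 4 bytes per RGBA pixel
    return new int[] {
        pixels.get(offset) & 0xFF,     // R (mask: Java bytes are signed)
        pixels.get(offset + 1) & 0xFF, // G
        pixels.get(offset + 2) & 0xFF, // B
        pixels.get(offset + 3) & 0xFF  // A
    };
  }

  public static void main(String[] args) {
    int width = 4, height = 4;
    ByteBuffer pixels = ByteBuffer.allocateDirect(width * height * 4)
        .order(ByteOrder.nativeOrder());
    // Stand-in for a framebuffer cleared to a solid bluish color:
    for (int i = 0; i < width * height; i++) {
      pixels.put((byte) 0).put((byte) 0).put((byte) 230).put((byte) 255);
    }
    int[] rgba = pixelAt(pixels, width, width / 2, height / 2);
    // prints 0,0,230,255
    System.out.println(rgba[0] + "," + rgba[1] + "," + rgba[2] + "," + rgba[3]);
  }
}
```

If the center pixel decodes to the clear color instead of the triangle color, the draw call after glClear never reached the attached texture.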

To simplify the code I provide here, I'm going to draw a triangle to a 2D texture through a FrameBuffer. If we can figure out which part is wrong, we can then easily solve the similar problem of drawing video frames.

The function will be called on Unity:

  public int displayTriangle() {
    Texture2D texture = new Texture2D(UnityPlayer.currentActivity);
    texture.init();

    Triangle triangle = new Triangle(UnityPlayer.currentActivity);
    triangle.init();

    TextureTransfer textureTransfer = new TextureTransfer();
    textureTransfer.tryToCreateFBO();

    mTextureWidth = 960;
    mTextureHeight = 960;
    textureTransfer.tryToInitTempTexture2D(texture.getTextureID(), mTextureWidth, mTextureHeight);

    textureTransfer.fboStart();
    triangle.draw();
    textureTransfer.fboEnd();

    // Unity needs a native texture id to create its own Texture2D object
    return texture.getTextureID();
  }

Initialization of 2D texture:

  protected void initTexture() {
    int[] idContainer = new int[1];
    GLES30.glGenTextures(1, idContainer, 0);
    textureId = idContainer[0];
    Log.i(TAG, "texture2D generated: " + textureId);
    // texture.getTextureID() will return this textureId

    bindTexture();

    GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D,
        GLES30.GL_TEXTURE_MIN_FILTER, GLES30.GL_NEAREST);
    GLES30.glTexParameterf(GLES30.GL_TEXTURE_2D,
        GLES30.GL_TEXTURE_MAG_FILTER, GLES30.GL_LINEAR);
    GLES30.glTexParameteri(GLES30.GL_TEXTURE_2D,
        GLES30.GL_TEXTURE_WRAP_S, GLES30.GL_CLAMP_TO_EDGE);
    GLES30.glTexParameteri(GLES30.GL_TEXTURE_2D,
        GLES30.GL_TEXTURE_WRAP_T, GLES30.GL_CLAMP_TO_EDGE);

    unbindTexture();
  }

  public void bindTexture() {
    GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, textureId);
  }

  public void unbindTexture() {
    GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, 0);
  }

draw() of Triangle:

  public void draw() {
    float[] vertexData = new float[] {
        0.0f,  0.0f, 0.0f,
        1.0f, -1.0f, 0.0f,
        1.0f,  1.0f, 0.0f
    };
    vertexBuffer = ByteBuffer.allocateDirect(vertexData.length * 4)
        .order(ByteOrder.nativeOrder())
        .asFloatBuffer()
        .put(vertexData);
    vertexBuffer.position(0);

    GLES30.glClearColor(0.0f, 0.0f, 0.9f, 1.0f);
    GLES30.glClear(GLES30.GL_DEPTH_BUFFER_BIT | GLES30.GL_COLOR_BUFFER_BIT);
    GLES30.glUseProgram(mProgramId);

    vertexBuffer.position(0);
    GLES30.glEnableVertexAttribArray(aPosHandle);
    GLES30.glVertexAttribPointer(
        aPosHandle, 3, GLES30.GL_FLOAT, false, 12, vertexBuffer);

    GLES30.glDrawArrays(GLES30.GL_TRIANGLE_STRIP, 0, 3);
  }
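The buffer bookkeeping in draw() can be sanity-checked in plain Java with no GL needed; the names below just mirror the snippet above. The stride of 12 passed to glVertexAttribPointer is exactly 3 floats of 4 bytes each, and position(0) matters because put() advances the buffer position:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class Main {
  public static void main(String[] args) {
    float[] vertexData = new float[] {
        0.0f,  0.0f, 0.0f,
        1.0f, -1.0f, 0.0f,
        1.0f,  1.0f, 0.0f
    };
    int componentsPerVertex = 3;                      // x, y, z
    int bytesPerFloat = 4;
    int stride = componentsPerVertex * bytesPerFloat; // the 12 passed to glVertexAttribPointer

    FloatBuffer vertexBuffer = ByteBuffer
        .allocateDirect(vertexData.length * bytesPerFloat) // 36 bytes for 3 vertices
        .order(ByteOrder.nativeOrder())                    // native order is required for GL
        .asFloatBuffer()
        .put(vertexData);                                  // advances position to 9
    vertexBuffer.position(0);                              // rewind so GL reads from the start

    // prints stride=12 capacity=9 first=0.0
    System.out.println("stride=" + stride
        + " capacity=" + vertexBuffer.capacity()
        + " first=" + vertexBuffer.get(0));
  }
}
```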

vertex shader of Triangle:

attribute vec4 aPosition;
void main() {
  gl_Position = aPosition;
}

fragment shader of Triangle:

precision mediump float;
void main() {
  gl_FragColor = vec4(0.9, 0.0, 0.0, 1.0);
}

Key code of TextureTransfer:

  public void tryToInitTempTexture2D(int texture2DId, int textureWidth, int textureHeight) {
    if (mTexture2DId != -1) {
      return;
    }

    mTexture2DId = texture2DId;
    GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, mTexture2DId);
    Log.i(TAG, "glBindTexture " + mTexture2DId + " to init for FBO");

    // make 2D texture empty
    GLES30.glTexImage2D(GLES30.GL_TEXTURE_2D, 0, GLES30.GL_RGBA, textureWidth, textureHeight, 0,
        GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, null);
    Log.i(TAG, "glTexImage2D, textureWidth: " + textureWidth + ", textureHeight: " + textureHeight);

    GLES30.glBindTexture(GLES30.GL_TEXTURE_2D, 0);

    fboStart();
    GLES30.glFramebufferTexture2D(GLES30.GL_FRAMEBUFFER, GLES30.GL_COLOR_ATTACHMENT0,
        GLES30.GL_TEXTURE_2D, mTexture2DId, 0);
    Log.i(TAG, "glFramebufferTexture2D");
    int fboStatus = GLES30.glCheckFramebufferStatus(GLES30.GL_FRAMEBUFFER);
    Log.i(TAG, "fbo status: " + fboStatus);
    if (fboStatus != GLES30.GL_FRAMEBUFFER_COMPLETE) {
      throw new RuntimeException("framebuffer " + mFBOId + " incomplete!");
    }
    fboEnd();
  }

  public void fboStart() {
    GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, mFBOId);
  }

  public void fboEnd() {
    GLES30.glBindFramebuffer(GLES30.GL_FRAMEBUFFER, 0);
  }

And finally some code on Unity-side:

int textureId = plugin.Call<int>("displayTriangle");
Debug.Log("native textureId: " + textureId);
Texture2D triangleTexture = Texture2D.CreateExternalTexture(
  960, 960, TextureFormat.RGBA32, false, true, (IntPtr) textureId);
triangleTexture.UpdateExternalTexture(triangleTexture.GetNativeTexturePtr());
rawImage.texture = triangleTexture;
rawImage.color = Color.white;

Well, the code above does not display the expected triangle but only a blue background. I added glGetError after nearly every OpenGL function call, but no errors are reported.

My Unity version is 2017.2.1. For the Android build, I disabled experimental multithreaded rendering; the other settings are all default (no texture compression, no development build, and so on). My app's minimum API level is 5.0 Lollipop and the target API level is 9.0 Pie.

I really need some help, thanks in advance!

Asked Aug 30 '18 by ywwynm

1 Answer

Now I've found the answer: if you want to do any drawing work in your plugin, you should do it at the native layer. So for an Android plugin, you should call OpenGL ES APIs from JNI instead of from the Java side. The reason is that Unity only allows drawing graphics on its rendering thread. If you simply call OpenGL ES APIs from the Java side, as I did in the question, they actually run on the Unity main thread instead of the rendering thread. Unity provides a method, GL.IssuePluginEvent, to call your own functions on the rendering thread, but it requires native code since it takes a function pointer as its callback. Here is a simple example of how to use it:

At JNI side:

// you can copy these headers from https://github.com/googlevr/gvr-unity-sdk/tree/master/native_libs/video_plugin/src/main/jni/Unity
#include "IUnityInterface.h"
#include "UnityGraphics.h"

static void on_render_event(int event_type) {
  // do all of your jobs related to rendering, including initializing the context,
  // linking shaders, creating program, finding handles, drawing and so on
}

// UnityRenderingEvent is an alias of void(*)(int) defined in UnityGraphics.h
UnityRenderingEvent get_render_event_function() {
  UnityRenderingEvent ptr = on_render_event;
  return ptr;
}

// notice you should return the pointer as a jlong to the Java side
extern "C" JNIEXPORT jlong JNICALL
Java_com_abc_xyz_YourPluginClass_getNativeRenderFunctionPointer(JNIEnv *env, jobject instance) {
  UnityRenderingEvent ptr = get_render_event_function();
  return (jlong) ptr;
}

At Android Java side:

class YourPluginClass {
  ...
  public native long getNativeRenderFunctionPointer();
  ...
}

At Unity side:

private void IssuePluginEvent(int pluginEventType) {
  long nativeRenderFuncPtr = Call_getNativeRenderFunctionPointer(); // call through plugin class
  IntPtr ptr = (IntPtr) nativeRenderFuncPtr;
  GL.IssuePluginEvent(ptr, pluginEventType); // pluginEventType is related to native function parameter event_type
}

void Start() {
  IssuePluginEvent(1); // let's assume 1 stands for initializing everything
  // get your texture2D id from plugin, create Texture2D object from it,
  // attach that to a GameObject, and start playing for the first time
}

void Update() {
  // call SurfaceTexture.updateTexImage in plugin
  IssuePluginEvent(2); // let's assume 2 stands for transferring TEXTURE_EXTERNAL_OES to TEXTURE_2D through FrameBuffer
  // call Texture2D.UpdateExternalTexture to update GameObject's appearance
}

You still need to do the texture transfer, and everything about it should happen at the JNI layer. But don't worry: the steps are nearly the same as in my question, just written in C++ instead of Java, and there is plenty of material about this process, so you can surely make it.
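The contract between the three layers boils down to a table of event codes, each mapping to one rendering job run by on_render_event on the rendering thread. That dispatch can be mirrored in plain Java (the codes 1 and 2 are just the assumed values from the example above, and the Runnable bodies stand in for real GL work):

```java
import java.util.HashMap;
import java.util.Map;

public class Main {
  // Each event code that Unity passes through GL.IssuePluginEvent
  // maps to exactly one rendering job at the native layer.
  static final Map<Integer, Runnable> RENDER_JOBS = new HashMap<>();

  static String lastJob = "none"; // records which job ran, for illustration

  // Mirrors on_render_event(int event_type): look up the job and run it.
  static void onRenderEvent(int eventType) {
    Runnable job = RENDER_JOBS.get(eventType);
    if (job != null) {
      job.run(); // unknown codes are silently ignored
    }
  }

  public static void main(String[] args) {
    // 1: one-time setup (context state, shaders, program, handles)
    RENDER_JOBS.put(1, () -> lastJob = "init");
    // 2: per-frame OES -> 2D transfer through the FrameBuffer
    RENDER_JOBS.put(2, () -> lastJob = "transfer");

    onRenderEvent(1);
    System.out.println(lastJob); // prints init
    onRenderEvent(2);
    System.out.println(lastJob); // prints transfer
  }
}
```

The point of the table is that Java and C# only ever pass integers across the boundary; every GL call stays inside the jobs, which Unity runs on its rendering thread.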

Finally, let me state the key to solving this problem once more: do your native stuff at the native layer and don't cling to pure Java. I'm genuinely surprised that there is no blog/answer/wiki simply telling us to write this code in C++. Although there are open-source implementations like Google's gvr-unity-sdk that give a complete reference, you might still doubt whether the task could be finished without writing any C++ code. Now we know it can't. However, to be honest, I think Unity could make this process even easier.

Answered Oct 04 '22 by ywwynm