Play video inside OpenGL ES using MediaCodec
MediaCodec was introduced in Android 4.1 (API 16). With it, we can easily decode video without using the Android NDK, for example to build a video playback application.
In game development, video also helps make the game environment more realistic, for example when you are playing a racing game with video banners around the track.
Here, I'll show how to decode a video and render it into OpenGL ES.
First, we prepare the pieces needed to decode the video: MediaExtractor and MediaCodec.
private boolean initExtractor() {
    extractor = new MediaExtractor();
    try {
        extractor.setDataSource(mFilePath);
    } catch (IOException e) {
        return false;
    }

    // Find the first video track
    tracknumb = -1;
    for (int i = 0; i < extractor.getTrackCount(); i++) {
        MediaFormat format = extractor.getTrackFormat(i);
        String mime = format.getString(MediaFormat.KEY_MIME);
        if (mime.startsWith("video/")) {
            tracknumb = i;
            break;
        }
    }
    if (tracknumb == -1) {
        Log.e("DecodeActivity", "Can't find video track!");
        return false;
    }

    // Select the video track on the extractor
    extractor.selectTrack(tracknumb);
    return true;
}

private boolean initDecoder(Surface surface) {
    // Get the MIME type and format of the selected track
    MediaFormat format = extractor.getTrackFormat(tracknumb);
    String mime = format.getString(MediaFormat.KEY_MIME);
    decoder = MediaCodec.createDecoderByType(mime);
    if (decoder == null) {
        Log.e("DecodeActivity", "Can't create decoder!");
        return false;
    }
    // Render the decoded frames directly into the given surface
    decoder.configure(format, surface, null, 0);
    decoder.start();
    return true;
}

Next, inside OpenGL ES, prepare the surface that MediaCodec will render into.
private final Object mUpdateLock = new Object(); // dedicated lock for the updateSurface flag

@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    final String fragmentShaderSourceOES =
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;\n" +
            "varying vec2 vTextureCoord;\n" +
            "uniform samplerExternalOES sTexture;\n" +
            "void main() {\n" +
            "  vec4 color = texture2D(sTexture, vTextureCoord);\n" +
            "  gl_FragColor = color;\n" +
            "}\n";

    // Prepare your shader program here
    mVertexShader = ...
    // Use a fragment shader that supports the OES external texture
    mPixelShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderSourceOES);
    mProgram = createProgram(mVertexShader, mPixelShader);

    // Prepare the texture handle
    int[] textures = new int[1];
    GLES20.glGenTextures(1, textures, 0);
    mTextureID = textures[0];
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTextureID);
    GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
    GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);

    // Link the texture handle to a SurfaceTexture
    mSurfaceTexture = new SurfaceTexture(mTextureID);
    mSurfaceTexture.setDefaultBufferSize(320, 240);
    mSurfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
        @Override
        public void onFrameAvailable(SurfaceTexture surfaceTexture) {
            synchronized (mUpdateLock) {
                updateSurface = true;
            }
        }
    });

    // Create the decoder surface
    mDecoderSurface = new Surface(mSurfaceTexture);
}

@Override
public void onDrawFrame(GL10 gl) {
    ...
    synchronized (mUpdateLock) {
        if (updateSurface) {
            mSurfaceTexture.updateTexImage(); // latch the newest frame if one is available
            updateSurface = false;
        }
    }

    // Use the program
    GLES20.glUseProgram(mProgram);
    CLGLUtility.checkGlError(TAG, "glUseProgram");

    // Bind the external texture
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTextureID);

    // Set the vertex positions
    mShape.setPositionArray(maPositionHandle);
    // Set the vertex texture coordinates
    mShape.setTexCoordArray(maTextureHandle);
    // Draw the shape
    mShape.drawArrays();
}
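The vertex shader itself is elided above (mVertexShader = ...). For reference, here is a minimal sketch of one that applies the SurfaceTexture transform matrix, modeled on the VideoSurfaceView sample listed in the references below; the uniform and attribute names (uMVPMatrix, uSTMatrix, aPosition, aTextureCoord) and the handle lookups are assumptions, not code from this post.

// Sketch of a vertex shader for the external texture (assumed names,
// modeled on the referenced VideoSurfaceView sample)
final String vertexShaderSource =
        "uniform mat4 uMVPMatrix;\n" +
        "uniform mat4 uSTMatrix;\n" +      // SurfaceTexture's transform matrix
        "attribute vec4 aPosition;\n" +
        "attribute vec4 aTextureCoord;\n" +
        "varying vec2 vTextureCoord;\n" +
        "void main() {\n" +
        "  gl_Position = uMVPMatrix * aPosition;\n" +
        "  vTextureCoord = (uSTMatrix * aTextureCoord).xy;\n" +
        "}\n";

// After linking the program, look up the uniform once
int muSTMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uSTMatrix");

// In onDrawFrame(), right after updateTexImage(), fetch the current
// transform matrix and pass it into the vertex shader
float[] mSTMatrix = new float[16];
mSurfaceTexture.getTransformMatrix(mSTMatrix);
GLES20.glUniformMatrix4fv(muSTMatrixHandle, 1, false, mSTMatrix, 0);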
Then set that surface to MediaCodec.

...
initDecoder(mDecoderSurface);
...

Finally, OpenGL ES and MediaCodec are linked together and we can start the game.
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    ...
    startDecode();
}

public void startDecode() {
    // Get the decoder's buffers
    decoderInputBuffers = decoder.getInputBuffers();
    decoderOutputBuffers = decoder.getOutputBuffers();

    // Start feeding and draining the buffers
    BufferInfo info = new BufferInfo();
    boolean inputDone = false;
    long startMs = System.currentTimeMillis();

    while (!threadInterrupted) {
        // Feed the decoder's input buffers until the input hits end of stream
        if (!inputDone) {
            inputDone = readDecoderBuffer();
        }
        // Drain the output buffers; stop once the decoder signals end of stream
        if (checkDecoderBuffer(info, startMs)) break;
    }

    decoder.stop();
    decoder.release();
    extractor.release();
}

private boolean readDecoderBuffer() {
    int inIndex = decoder.dequeueInputBuffer(10000);
    if (inIndex < 0) return false; // no input buffer available yet, try again later

    ByteBuffer buffer = decoderInputBuffers[inIndex];
    int sampleSize = extractor.readSampleData(buffer, 0);
    if (sampleSize < 0) {
        // We shouldn't stop the playback at this point; just pass the EOS
        // flag to the decoder, we will get it again from dequeueOutputBuffer
        Log.d("DecodeActivity", "InputBuffer BUFFER_FLAG_END_OF_STREAM");
        decoder.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
        return true;
    } else {
        decoder.queueInputBuffer(inIndex, 0, sampleSize, extractor.getSampleTime(), 0);
        extractor.advance();
    }
    return false;
}

private boolean checkDecoderBuffer(BufferInfo info, long startMs) {
    // Get an output buffer, and control the playback timing
    int outIndex = decoder.dequeueOutputBuffer(info, 10000);
    Log.i(TAG, "BufferInfo: size:" + info.size + " presentationTimeUs:" + info.presentationTimeUs
            + " offset:" + info.offset + " flags:" + info.flags);
    switch (outIndex) {
        case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
            Log.d("DecodeActivity", "INFO_OUTPUT_BUFFERS_CHANGED");
            decoderOutputBuffers = decoder.getOutputBuffers();
            break;
        case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
            Log.d("DecodeActivity", "New format " + decoder.getOutputFormat());
            break;
        case MediaCodec.INFO_TRY_AGAIN_LATER:
            Log.d("DecodeActivity", "dequeueOutputBuffer timed out!");
            break;
        default:
            ByteBuffer buffer = decoderOutputBuffers[outIndex];
            Log.v("DecodeActivity", "We can't use this buffer but render it due to the API limit, " + buffer);

            // A very simple clock to keep the video's FPS, or the playback will be too fast
            while (info.presentationTimeUs / 1000 > System.currentTimeMillis() - startMs) {
                try {
                    Thread.sleep(10);
                } catch (InterruptedException e) {
                    return false;
                }
            }
            // Render the frame to the surface we gave the decoder
            decoder.releaseOutputBuffer(outIndex, true);
            break;
    }

    // All decoded frames have been rendered, we can stop playing now
    if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
        Log.d("DecodeActivity", "OutputBuffer BUFFER_FLAG_END_OF_STREAM");
        return true;
    }
    return false;
}
References (thanks to Ray):
https://github.com/crossle/MediaPlayerSurface/blob/master/src/me/crossle/demo/surfacetexture/VideoSurfaceView.java
https://vec.io/posts/android-hardware-decoding-with-mediacodec
Hello, I'm interested in working on this post.
First, I appreciate you sharing such good information, but I want to know what the 'mShape' instance is.
Hi there, mShape is a class that manipulates a single 3D object, like a plane, a triangle, etc.
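For illustration, here is a minimal sketch of what such a class might look like (hypothetical; the original mShape is not shown in this post):

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import android.opengl.GLES20;

// Hypothetical stand-in for mShape: a textured plane drawn as a triangle strip
public class GLPlane {
    private final FloatBuffer mVertices;   // x, y, z per vertex
    private final FloatBuffer mTexCoords;  // s, t per vertex

    public GLPlane() {
        float[] pos = { -1f, -1f, 0f,   1f, -1f, 0f,   -1f, 1f, 0f,   1f, 1f, 0f };
        float[] tex = {  0f, 1f,   1f, 1f,   0f, 0f,   1f, 0f };
        mVertices = ByteBuffer.allocateDirect(pos.length * 4)
                .order(ByteOrder.nativeOrder()).asFloatBuffer();
        mVertices.put(pos).position(0);
        mTexCoords = ByteBuffer.allocateDirect(tex.length * 4)
                .order(ByteOrder.nativeOrder()).asFloatBuffer();
        mTexCoords.put(tex).position(0);
    }

    public void setPositionArray(int handle) {
        GLES20.glVertexAttribPointer(handle, 3, GLES20.GL_FLOAT, false, 0, mVertices);
        GLES20.glEnableVertexAttribArray(handle);
    }

    public void setTexCoordArray(int handle) {
        GLES20.glVertexAttribPointer(handle, 2, GLES20.GL_FLOAT, false, 0, mTexCoords);
        GLES20.glEnableVertexAttribArray(handle);
    }

    public void drawArrays() {
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
    }
}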
Hey Fendy, this looks awesome. Do you have an example Android project by chance? Thanks! Mark
Hi Mark, the reference sites are added. Thanks
DeleteHi there,
Love this example that uses OpenGL to map each frame to a texture. However, for anyone who wants to use this code, I have two comments:
(1) You seem to have left out the Vertex Shader code... which is fundamental because every time you make a call to updateTexImage, you must find the ST transformation matrix and you must pass this as a uniform into your Vertex Shader code. This matrix needs to transform the texture co-ordinates so that it can properly access the right locations in the texture. A great example of doing this can be found here: https://github.com/crossle/MediaPlayerSurface/blob/master/src/me/crossle/demo/surfacetexture/VideoSurfaceView.java
(2) I see that you invoked the startDecode() method inside onSurfaceCreated... but by doing this, the while loop will never exit onSurfaceCreated, and thus no drawing will be able to be performed. It is highly recommended that this method be placed on a separate thread: create a class that implements the Runnable interface and place startDecode() within its run() method, as in the sketch below.
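A minimal sketch of that suggestion (it assumes the startDecode() method and renderer fields from the post are accessible):

@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    // ... shader, texture, and SurfaceTexture setup as in the post ...
    new Thread(new Runnable() {
        @Override
        public void run() {
            startDecode(); // the blocking while loop now runs on its own thread
        }
    }).start();
}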
BTW, you should probably cite where you grabbed the original code from. This is a modified implementation taken from Cedric Fung's website: https://vec.io/posts/android-hardware-decoding-with-mediacodec, created on January 11, 2013.
Either way, great work!
- Ray.
Hi Ray,
In this post, I just explained how to use MediaCodec with OpenGL ES.
Anyway, the referenced code has been added to this blog.
Thanks
Fendy
Hi Fendy,
I hope I didn't come across as condescending or critical. I really found this tutorial helpful, as I needed it for the application I am developing.
Thank you again for making this available, and for saving me from jumping through so many hoops to figure this out!
- Ray.
Hi Ray,
Glad that this blog is helpful.
Please let me know if I missed something.
Fendy
Thank you so much. This was incredibly helpful.