Apple has a couple of examples for iOS where they convert from YUV420 planar data to RGBA in an OpenGL ES 2.0 shader. While not specifically for Android, you should still be able to use this GLSL shader code to accomplish what you want.
This is what I use in a conversion fragment shader based on their example:
varying highp vec2 textureCoordinate;

uniform sampler2D luminanceTexture;
uniform sampler2D chrominanceTexture;

void main()
{
    mediump vec3 yuv;
    lowp vec3 rgb;

    yuv.x = texture2D(luminanceTexture, textureCoordinate).r;
    yuv.yz = texture2D(chrominanceTexture, textureCoordinate).rg - vec2(0.5, 0.5);

    // BT.601, the standard for SDTV, is provided here as a reference:
    /*
    rgb = mat3(      1,       1,       1,
                     0, -.39465, 2.03211,
               1.13983, -.58060,       0) * yuv;
    */

    // BT.709, the standard for HDTV:
    rgb = mat3(      1,       1,       1,
                     0, -.21482, 2.12798,
               1.28033, -.38059,       0) * yuv;

    gl_FragColor = vec4(rgb, 1);
}
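The fragment shader expects interpolated texture coordinates from the vertex stage. A minimal pass-through vertex shader to pair with it could look like this (a sketch; the attribute names are arbitrary, so wire them up to your quad geometry however you like):

attribute vec4 position;
attribute vec2 inputTextureCoordinate;

varying vec2 textureCoordinate;

void main()
{
    // Pass the full-screen quad position through unchanged and forward
    // the per-vertex texture coordinate to the fragment shader.
    gl_Position = position;
    textureCoordinate = inputTextureCoordinate;
}

Render a full-screen quad with this program, with your two textures bound to the texture units that the luminanceTexture and chrominanceTexture uniforms point at.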
luminanceTexture is the Y plane of your image uploaded as a texture, and chrominanceTexture is the interleaved UV plane.
This shader runs in a fraction of a millisecond per 1080p video frame on an iPhone 4S. I believe the matrices above are tuned for video-range YUV, so you may need to adjust the values if your source delivers full-range data.
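For reference, 8-bit video-range (broadcast) YUV keeps Y within 16-235 and chroma within 16-240, while full range uses the whole 0-255. One way to bridge the two without touching the matrix is to rescale the sampled values right after the two texture2D reads; a rough sketch (not part of the original shader, assuming 8-bit video-range input feeding a full-range matrix):

// Expand 8-bit video-range samples to full range (approximate).
yuv.x = (yuv.x - 16.0 / 255.0) * (255.0 / 219.0);
yuv.yz = yuv.yz * (255.0 / 224.0);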