I'm trying to develop an app that lets me draw over the video while it's being recorded, and then save both the recording and the drawing into a single mp4 file for later use. I also want to use the camera2 library; specifically, I need my app to run on devices above API 21, and I always avoid deprecated libraries.
I tried many ways to do this, including FFmpeg, where I composited an overlay of TextureView.getBitmap() (from the camera) with a bitmap taken from the drawing canvas. It worked, but because it's a slow operation, the video couldn't capture enough frames (not even 25 fps) and it played back too fast. I also want to include audio.
I considered the MediaProjection library, but I'm not sure it can capture, inside its VirtualDisplay, only the layout containing the camera and the drawing, because the app's user can also add text to the video, and I don't want the keyboard to appear in the capture.
Please help; it's been a week of research and I've found nothing that works for me.
P.S.: I have no problem with a bit of processing time being needed after the user presses the "Stop recording" button.
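For context, the per-frame compositing I was doing looked roughly like this (a rough sketch; names like "cameraView" and "overlayBitmap" are illustrative):
import android.graphics.Bitmap;
import android.graphics.Canvas;
// TextureView.getBitmap() copies the frame to a CPU bitmap, which is the slow part.
Bitmap frame = cameraView.getBitmap();             // camera frame as a Bitmap (slow)
Canvas composite = new Canvas(frame);
composite.drawBitmap(overlayBitmap, 0f, 0f, null); // user's drawing on top
// "frame" was then handed to FFmpeg as a single video frame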
Edit:
Now, after Eddy's answer, I'm using the shadercam app to draw over the camera surface, since it handles the video rendering, and the workaround to do is to render my canvas to a bitmap and then to a GL texture, but I haven't been able to do that successfully. I need your help, guys; I need to finish the app :S
I'm using the shadercam library (https://github.com/googlecreativelab/shadercam), and replaced the "ExampleRenderer" file with the following code:
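As I understand it, the core of that workaround is uploading the Canvas-backed bitmap into an OpenGL texture, roughly like this (a minimal sketch using GLUtils.texImage2D; it assumes it runs on the GL thread and isn't specific to shadercam, whose addTexture/updateTexture helpers presumably do something similar internally):
import android.graphics.Bitmap;
import android.opengl.GLES20;
import android.opengl.GLUtils;

// Uploads a Canvas-backed Bitmap into an OpenGL texture. Must run on the GL thread.
public static int uploadBitmapAsTexture(Bitmap bitmap) {
    int[] handles = new int[1];
    GLES20.glGenTextures(1, handles, 0);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, handles[0]);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
    GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
    return handles[0];
}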
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.SurfaceTexture;
import android.opengl.GLES20;

public class WriteDrawRenderer extends CameraRenderer
{
    private float offsetR = 1f;
    private float offsetG = 1f;
    private float offsetB = 1f;

    // sentinel value meaning "no touch point pending since the last frame"
    private static final float NO_TOUCH = 1000000000f;
    private float touchX = NO_TOUCH;
    private float touchY = NO_TOUCH;

    private Bitmap textBitmap;
    private int textureId;
    private boolean isFirstTime = true;

    // a canvas that draws into a bitmap instead of rendering to the screen
    private Canvas bitmapCanvas;

    /**
     * By not modifying anything, the default shaders in shadercam's assets folder
     * will be used.
     *
     * Base all shaders off those, since there are some default uniforms/textures that
     * will be passed every time for the camera coordinates and texture coordinates.
     */
    public WriteDrawRenderer(Context context, SurfaceTexture previewSurface, int width, int height)
    {
        super(context, previewSurface, width, height, "touchcolor.frag.glsl", "touchcolor.vert.glsl");
        // other setup, if needed, goes here
    }

    /**
     * We override {@link #setUniformsAndAttribs()} and make sure to call the super so
     * we can add our own uniforms to our shaders here. CameraRenderer handles the rest
     * for us automatically.
     */
    @Override
    protected void setUniformsAndAttribs()
    {
        super.setUniformsAndAttribs();

        int offsetRLoc = GLES20.glGetUniformLocation(mCameraShaderProgram, "offsetR");
        int offsetGLoc = GLES20.glGetUniformLocation(mCameraShaderProgram, "offsetG");
        int offsetBLoc = GLES20.glGetUniformLocation(mCameraShaderProgram, "offsetB");

        GLES20.glUniform1f(offsetRLoc, offsetR);
        GLES20.glUniform1f(offsetGLoc, offsetG);
        GLES20.glUniform1f(offsetBLoc, offsetB);

        // only draw when a new touch point has been set since the last frame
        if (touchX < NO_TOUCH && touchY < NO_TOUCH)
        {
            // a yellow, anti-aliased paint for the text
            Paint yellowPaint = new Paint();
            yellowPaint.setColor(Color.YELLOW);
            yellowPaint.setAntiAlias(true);
            yellowPaint.setTextSize(70);

            if (isFirstTime)
            {
                // lazily create the bitmap and the canvas that draws into it
                textBitmap = Bitmap.createBitmap(mSurfaceWidth, mSurfaceHeight, Bitmap.Config.ARGB_8888);
                bitmapCanvas = new Canvas(textBitmap);
            }

            bitmapCanvas.drawText("Test Text", touchX, touchY, yellowPaint);

            if (isFirstTime)
            {
                // first frame: register the bitmap as a new texture with shadercam
                textureId = addTexture(textBitmap, "textBitmap");
                isFirstTime = false;
            }
            else
            {
                // subsequent frames: re-upload the updated bitmap to the same texture
                updateTexture(textureId, textBitmap);
            }

            // reset the sentinel so the same point isn't drawn again next frame
            touchX = NO_TOUCH;
            touchY = NO_TOUCH;
        }
    }

    /**
     * Stores a touch point from the TextureView so that the next rendered frame
     * draws text at that position on the bitmap overlay.
     * @param rawX raw x on screen
     * @param rawY raw y on screen
     */
    public void setTouchPoint(float rawX, float rawY)
    {
        this.touchX = rawX;
        this.touchY = rawY;
    }
}
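For reference, this is how I'm forwarding touch events to the renderer (illustrative wiring; "textureView" and "mRenderer" are my own field names in the hosting activity):
import android.view.MotionEvent;
import android.view.View;

textureView.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        // forward only the down event; the renderer draws it on the next frame
        if (event.getAction() == MotionEvent.ACTION_DOWN) {
            mRenderer.setTouchPoint(event.getX(), event.getY());
        }
        return true;
    }
});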
Please help, everyone; it's been a month and I'm still stuck on the same app :( and I know nothing about OpenGL. For two weeks I've been trying to use this project for my app, and nothing is rendered on the video.
Thanks in advance!