
It has been demonstrated how to feed MediaCodec a Surface input such as the camera preview, but is there a practical way to buffer this input before submitting it to MediaCodec?

In my experiments, the Galaxy Nexus suffered unacceptable problems generating an audio/video stream with the direct, synchronous encoding approach in CameraToMpegTest.java.

When using MediaCodec with byte[] or ByteBuffer input, we can submit the unencoded data to an ExecutorService or a similar queue for processing, ensuring that no frames are dropped even if the device experiences CPU spikes beyond our application's control. However, because a color-format conversion is required between Android's Camera and MediaCodec, this approach is unrealistic for high-resolution live video.
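The hand-off described above can be sketched in plain Java. This is a hypothetical illustration of the queueing idea only (the class and method names are mine, and the encode step is a stand-in for the real MediaCodec call):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch of the byte[] approach: camera frames are handed off
// to a single-threaded executor so the camera callback never blocks on the
// encoder -- at the cost of copying every frame.
class FrameQueue {
    private final ExecutorService encoderExecutor =
            Executors.newSingleThreadExecutor();
    private int framesEncoded = 0;

    public void onFrameAvailable(byte[] frame) {
        // Copy, because the camera reuses its callback buffer.
        final byte[] copy = frame.clone();
        encoderExecutor.submit(() -> encode(copy));
    }

    // Stand-in for the real encode step (e.g. MediaCodec.queueInputBuffer).
    private synchronized void encode(byte[] frame) {
        framesEncoded++;
    }

    public synchronized int framesEncoded() {
        return framesEncoded;
    }

    public void shutdown() {
        encoderExecutor.shutdown();
        try {
            encoderExecutor.awaitTermination(5, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
```

Note that the `clone()` call is exactly the per-frame copy that makes this impractical at high resolutions.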

Ideas

  1. Is there a way to create a NativePixmapType with EGL14.eglCopyBuffers(EGLDisplay d, EGLSurface s, NativePixmapType p) that could be used to feed MediaCodec?

  2. Can anyone from Android comment on whether harmonizing the ByteBuffer formats between Camera and MediaCodec is on the roadmap?


1 Answer


You really don't want to copy the data at all. Allocating storage for and copying a large chunk of data can take long enough to kill your frame rate. This generally rules out byte[] and ByteBuffer[] solutions, even if you didn't have to do a U/V plane swap.

The most efficient way to move data through the system is with a Surface. The trick is that a Surface isn't a buffer, it's an interface to a queue of buffers. The buffers are passed around by reference; when you unlockCanvasAndPost() you're actually placing the current buffer onto a queue for the consumer, which is often in a different process.

There is no public mechanism for creating a new buffer and adding it to the set used by the queue, or for extracting buffers from the queue, so you can't implement a DIY buffering scheme on the side. There's no public interface to change the number of buffers in the pool.
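The buffer-queue behavior described above can be modeled in plain Java. This is only an analogy for how a Surface circulates a fixed pool of buffers by reference, not the platform implementation:

```java
import java.nio.ByteBuffer;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Rough plain-Java analogy (not the platform code): a fixed pool of buffers
// circulates between producer and consumer by reference. Posting a frame
// moves a reference between queues; no pixel data is copied.
class BufferQueueModel {
    private final BlockingQueue<ByteBuffer> free;   // producer side
    private final BlockingQueue<ByteBuffer> queued; // consumer side

    public BufferQueueModel(int poolSize, int bufferBytes) {
        free = new ArrayBlockingQueue<>(poolSize);
        queued = new ArrayBlockingQueue<>(poolSize);
        for (int i = 0; i < poolSize; i++) {
            free.add(ByteBuffer.allocateDirect(bufferBytes));
        }
    }

    // Producer: like lockCanvas() -- borrow a free buffer to draw into.
    public ByteBuffer acquire() throws InterruptedException {
        return free.take();
    }

    // Producer: like unlockCanvasAndPost() -- hand the same buffer,
    // by reference, to the consumer.
    public void post(ByteBuffer buf) throws InterruptedException {
        queued.put(buf);
    }

    // Consumer: take the filled buffer, use it, then return it to the pool.
    public ByteBuffer acquireFilled() throws InterruptedException {
        return queued.take();
    }

    public void release(ByteBuffer buf) throws InterruptedException {
        free.put(buf);
    }
}
```

The fixed `poolSize` is the part you cannot change through any public Android API, which is why a DIY buffering scheme on the side is not possible.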

It'd be useful to know what it is that's causing the hiccups. The Android tool for analyzing such issues is systrace, available in Android 4.1+ (docs, example, bigflake example). If you can identify the source of the CPU load, or determine that it's not CPU but rather some bit of code getting tangled up, you'll likely have a solution that's much easier than adding more buffers to Surface.

Answered 2013-10-08T23:09:56.937