
Since there is no way to get compressed frames from the video with the current iOS SDK, how can we implement real-time streaming the way Skype does? The only way I can see right now is:

  1. Get the uncompressed frames from AVCaptureVideoDataOutput (see the sketch after this list)
  2. Compress those frames with a third-party library
  3. Send the frames to the server
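
For step 1, a minimal capture-side sketch in Objective-C might look like the following; the class name, queue label, preset, and pixel format are my own choices for illustration, not anything mandated by the SDK:

    #import <AVFoundation/AVFoundation.h>

    // Minimal capture pipeline: session -> camera input -> video data output.
    // "FrameGrabber", the queue label, preset, and pixel format are illustrative.
    @interface FrameGrabber : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
    @property (nonatomic, strong) AVCaptureSession *session;
    @end

    @implementation FrameGrabber

    - (void)start
    {
        self.session = [[AVCaptureSession alloc] init];
        self.session.sessionPreset = AVCaptureSessionPreset640x480;

        AVCaptureDevice *camera =
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceInput *input =
            [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
        [self.session addInput:input];

        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        // BGRA is the easiest pixel format to process on the CPU afterwards.
        output.videoSettings = [NSDictionary dictionaryWithObject:
            [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
            forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        dispatch_queue_t queue = dispatch_queue_create("video.frames", NULL);
        [output setSampleBufferDelegate:self queue:queue];
        [self.session addOutput:output];

        [self.session startRunning];
    }

    // Called once per captured (uncompressed) frame.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // Steps 2 and 3 would go here: compress the frame and send it out.
    }

    @end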

Is there any other way to get this done? Which libraries are available for the compression, and are they App Store compliant? Thanks in advance.


1 Answer


I'm struggling with this, too. The ffmpeg library seems like it works for compression, but the licensing means you have to release your source code.

You can set your object as the sample buffer delegate of AVCaptureVideoDataOutput and implement this callback, which gets called on a dispatch queue:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection;

Then you'll get the uncompressed video, which you can process into a UIImage or JPEG (Apple has code samples for this), but there's no way to get the hardware-compressed H.264 frames, which is what we really want. This is where you could bring in a library like ffmpeg to compress the video into H.264 or whatever.
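
For reference, the conversion to JPEG inside that callback looks roughly like this for a BGRA buffer, following Apple's published technique; the helper name and JPEG quality are my own:

    #import <UIKit/UIKit.h>
    #import <CoreMedia/CoreMedia.h>

    // Converts a BGRA CMSampleBufferRef into JPEG data.
    // Assumes the output's videoSettings asked for kCVPixelFormatType_32BGRA.
    static NSData *JPEGDataFromSampleBuffer(CMSampleBufferRef sampleBuffer)
    {
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);

        void *base = CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
        size_t width = CVPixelBufferGetWidth(pixelBuffer);
        size_t height = CVPixelBufferGetHeight(pixelBuffer);

        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(base, width, height, 8,
            bytesPerRow, colorSpace,
            kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef cgImage = CGBitmapContextCreateImage(context);

        UIImage *image = [UIImage imageWithCGImage:cgImage];
        NSData *jpeg = UIImageJPEGRepresentation(image, 0.7); // quality 0..1

        CGImageRelease(cgImage);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
        return jpeg;
    }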

Currently, I'm trying to see if I can interpret the AVAssetWriter file output and redirect that to a stream (it can write hardware-compressed video), but Apple seems to be making this hard for some reason.
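
For what it's worth, the file-writing half of that idea is the straightforward part. A rough sketch is below, with a placeholder output URL and frame size; redirecting the resulting file output to a live stream, which is the hard part, is not shown:

    #import <AVFoundation/AVFoundation.h>

    // Minimal AVAssetWriter setup: feed it the captured sample buffers and the
    // hardware encoder writes H.264 into an .mp4 file. URL/size are placeholders.
    static AVAssetWriter *writer;
    static AVAssetWriterInput *writerInput;

    static void SetUpWriter(NSURL *outputURL)
    {
        writer = [[AVAssetWriter alloc] initWithURL:outputURL
                                           fileType:AVFileTypeMPEG4
                                              error:nil];

        NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
            AVVideoCodecH264, AVVideoCodecKey,
            [NSNumber numberWithInt:640], AVVideoWidthKey,
            [NSNumber numberWithInt:480], AVVideoHeightKey, nil];

        writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                          outputSettings:settings];
        writerInput.expectsMediaDataInRealTime = YES;
        [writer addInput:writerInput];
    }

    // Call this from the capture callback for every frame.
    static void AppendFrame(CMSampleBufferRef sampleBuffer)
    {
        // Start the writing session at the first frame's timestamp so the
        // file's timeline matches the capture clock.
        if (writer.status == AVAssetWriterStatusUnknown) {
            [writer startWriting];
            [writer startSessionAtSourceTime:
                CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
        }
        if (writerInput.readyForMoreMediaData) {
            [writerInput appendSampleBuffer:sampleBuffer];
        }
    }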

Let me know if you find something that works.

answered 2011-11-28 21:52