
Here I am trying to record video and process the output buffers. I am recording video with an AVFoundation camera, collecting the output in the sample-buffer delegate method, and displaying it in an image view. Everything works fine, but how do I save the video from here? Can anyone please help me?

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    UIImage *fImage = [self addText:image text:@"your text here"]; // image is processed here
    // This delegate runs on the capture queue; UI updates belong on the main thread.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.dispFrames.image = fImage;
    });
}

2 Answers


You have to set up an AVAssetWriter with audio/video inputs and append your sample buffers to it. You can look at this sample code, in the RosyWriterVideoProcessor class: https://developer.apple.com/library/ios/samplecode/RosyWriter/Listings/Classes_RosyWriterVideoProcessor_m.html#//apple_ref/doc/uid/DTS40011110-Classes_RosyWriterVideoProcessor_m-DontLinkElementID_8
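A minimal sketch of that AVAssetWriter flow in Swift, assuming video-only output; the names `writer`, `writerInput`, and `append(_:)` are illustrative, not from the RosyWriter sample, and the output settings are placeholder values:

```swift
import AVFoundation

// Set up once, before recording starts.
let outputURL = FileManager.default.temporaryDirectory
    .appendingPathComponent("capture.mov")
let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 1280,
    AVVideoHeightKey: 720
])
writerInput.expectsMediaDataInRealTime = true
writer.add(writerInput)

// Call from captureOutput(_:didOutput:from:) for each sample buffer.
func append(_ sampleBuffer: CMSampleBuffer) {
    if writer.status == .unknown {
        writer.startWriting()
        writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
    }
    if writerInput.isReadyForMoreMediaData {
        writerInput.append(sampleBuffer)
    }
}

// When recording stops, finish the file.
func finish() {
    writerInput.markAsFinished()
    writer.finishWriting {
        // The movie file is now complete at outputURL.
    }
}
```

Note that the session must be started with the timestamp of the first buffer you append, otherwise the written movie's timeline will be wrong.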

Answered 2013-09-03T03:23:20.453

I wrote a sample project with Swift 5 and AVFoundation: photo and video camera in one controller, with a switcher between the front and back cameras. Check it via this link

The project's video output code:

func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: [AVCaptureConnection], error: Error?) {
    let showMediaController = ShowMediaController()
    showMediaController.url = outputFileURL
    navigationController?.pushViewController(showMediaController, animated: true)
}

In ShowMediaController I get the URL of the new video and can play it:

@objc private func handlePlay() {
    if let url = url {
        player = AVPlayer(url: url)
        playerLayer = AVPlayerLayer(player: player)
        playerLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
        playerLayer?.frame = view.frame
        view.layer.insertSublayer(playerLayer!, at: 2)
        player?.play()
        playButton.isHidden = true
    }
}
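Since the original question asks how to save the video, the same `outputFileURL` can also be written to the photo library. A hedged sketch using the Photos framework (the function name `save(videoAt:)` is illustrative; the app's Info.plist must declare `NSPhotoLibraryAddUsageDescription`):

```swift
import Photos

// Save a recorded movie file to the user's photo library.
func save(videoAt url: URL) {
    PHPhotoLibrary.requestAuthorization { status in
        guard status == .authorized else { return }
        PHPhotoLibrary.shared().performChanges({
            // Creates a new video asset from the file on disk.
            PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: url)
        }) { success, error in
            print(success ? "Saved" : "Save failed: \(String(describing: error))")
        }
    }
}
```

You could call this from `fileOutput(_:didFinishRecordingTo:from:error:)` with `outputFileURL` before or after pushing `ShowMediaController`.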
Answered 2019-12-14T05:14:49.633