I've been trying to do some real-time video image processing in MonoTouch. I'm using an AVCaptureSession to get frames from the camera, which works fine together with an AVCaptureVideoPreviewLayer.
I also successfully get the "DidOutputSampleBuffer" callback in my delegate class. However, every way I've tried to create a UIImage from the resulting CMSampleBuffer has failed.
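For context, this is roughly the shape of my delegate class, assuming it subclasses AVCaptureVideoDataOutputSampleBufferDelegate as SetSampleBufferDelegateAndQueue requires (the controller type name here is just a placeholder; the actual override bodies are shown further down):

public class videoOutputDelegate : AVCaptureVideoDataOutputSampleBufferDelegate
{
    // Placeholder sketch: the constructor only keeps a reference back to the
    // view controller so ProcessImage can be called on it later; the real class
    // also contains the DidOutputSampleBuffer overrides shown below.
    MyViewController controller;

    public videoOutputDelegate (MyViewController controller)
    {
        this.controller = controller;
    }
}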
Here is my code for setting up the capture session:
captureSession = new AVCaptureSession ();
captureSession.BeginConfiguration ();

videoCamera = AVCaptureDevice.DefaultDeviceWithMediaType (AVMediaType.Video);

if (videoCamera != null)
{
    captureSession.SessionPreset = AVCaptureSession.Preset1280x720;

    videoInput = AVCaptureDeviceInput.FromDevice (videoCamera);
    if (videoInput != null)
        captureSession.AddInput (videoInput);

    videoCapDelegate = new videoOutputDelegate (this);
    DispatchQueue queue = new DispatchQueue ("videoFrameQueue");

    videoOutput = new AVCaptureVideoDataOutput ();
    videoOutput.SetSampleBufferDelegateAndQueue (videoCapDelegate, queue);
    videoOutput.AlwaysDiscardsLateVideoFrames = true;
    videoOutput.VideoSettings.PixelFormat = CVPixelFormatType.CV24RGB;

    captureSession.AddOutput (videoOutput);
    videoOutput.ConnectionFromMediaType (AVMediaType.Video).VideoOrientation = AVCaptureVideoOrientation.Portrait;

    previewLayer = AVCaptureVideoPreviewLayer.FromSession (captureSession);
    previewLayer.Frame = UIScreen.MainScreen.Bounds;
    previewLayer.AffineTransform = CGAffineTransform.MakeRotation (Convert.DegToRad (-90));
    //this.View.Layer.AddSublayer (previewLayer);

    captureSession.CommitConfiguration ();
    captureSession.StartRunning ();
}
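Before going into the two attempts below, here is a small diagnostic sketch I would drop into DidOutputSampleBuffer to log what the capture output actually delivers (the logging itself is my addition, not part of the original code, and it assumes the PixelFormatType property is bound on CVPixelBuffer in this MonoTouch version):

public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, MonoTouch.CoreMedia.CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
{
    // Hypothetical diagnostic: log the geometry and format the capture output
    // really hands us, so the CGBitmapContext parameters used below can be
    // checked against the actual buffer layout.
    CVPixelBuffer pixelBuffer = sampleBuffer.GetImageBuffer () as CVPixelBuffer;
    Debug.Print ("{0}x{1}, {2} bytes/row, format {3}",
        pixelBuffer.Width, pixelBuffer.Height,
        pixelBuffer.BytesPerRow, pixelBuffer.PixelFormatType);
    sampleBuffer.Dispose ();
}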
I tried creating a CGBitmapContext from the CVPixelBuffer cast from the sample buffer's image buffer, like so:
public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, MonoTouch.CoreMedia.CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
{
    CVPixelBuffer pixelBuffer = sampleBuffer.GetImageBuffer () as CVPixelBuffer;
    CVReturn flag = pixelBuffer.Lock (0);

    if (flag == CVReturn.Success)
    {
        CGBitmapContext context = new CGBitmapContext
        (
            pixelBuffer.BaseAddress,
            pixelBuffer.Width,
            pixelBuffer.Height,
            8,
            pixelBuffer.BytesPerRow,
            CGColorSpace.CreateDeviceRGB (),
            CGImageAlphaInfo.PremultipliedFirst
        );

        UIImage image = new UIImage (context.ToImage ());
        ProcessImage (image);

        pixelBuffer.Unlock (0);
    }
    else
        Debug.Print (flag.ToString ());

    sampleBuffer.Dispose ();
}
This results in the following error:
<Error>: CGBitmapContextCreate: invalid data bytes/row: should be at least 2880 for 8 integer bits/component, 3 components, kCGImageAlphaPremultipliedFirst.
Even after tweaking the parameters, I either get invalid handle exceptions or segfaults down in native Objective-C code.
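For what it's worth, the 2880 in the error matches a 4-byte-per-pixel context on a 720-pixel-wide row (720 * 4 = 2880), while a CV24RGB buffer only carries 3 bytes per pixel. Here is a sketch of the variant I would try next, assuming the output format were switched to CV32BGRA; I have not verified that this is actually the fix:

// Assumption: in the session setup, the output format is changed to 32-bit BGRA:
//     videoOutput.VideoSettings.PixelFormat = CVPixelFormatType.CV32BGRA;
// BytesPerRow should then be 4 * Width (e.g. 720 * 4 = 2880), which is what
// CGBitmapContextCreate expects for an RGB context that carries alpha.
CGBitmapContext context = new CGBitmapContext
(
    pixelBuffer.BaseAddress,
    pixelBuffer.Width,
    pixelBuffer.Height,
    8,                               // bits per component
    pixelBuffer.BytesPerRow,         // 4 bytes per pixel for a BGRA buffer
    CGColorSpace.CreateDeviceRGB (),
    CGImageAlphaInfo.PremultipliedFirst   // BGRA may additionally want the native 32Little byte-order flag
);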
I've also tried simply creating a CIImage from the CVImageBuffer and creating a UIImage from that, like so:
public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, MonoTouch.CoreMedia.CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
{
    CIImage cImage = new CIImage (sampleBuffer.GetImageBuffer ());
    UIImage image = new UIImage (cImage);

    ProcessImage (image);

    sampleBuffer.Dispose ();
}
This throws an exception when initializing the CIImage:
NSInvalidArgumentException Reason: -[CIImage initWithCVImageBuffer:]: unrecognized selector sent to instance 0xc821d0
Honestly, this feels like it might be a MonoTouch bug of some kind, but if I'm missing something or simply going about this in a strange way, please point me toward an alternative solution.
Thanks