I'm trying to use the iOS 7 QR/barcode reading functionality in the AVFoundation framework. The following code sets up the capture session:
- (void)setupCaptureSession_iOS7 {
    self.session = [[AVCaptureSession alloc] init];

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input) {
        NSLog(@"Error: %@", error);
        return;
    }
    [self.session addInput:input];

    // Turn on continuous autofocus at the middle of the view.
    if ([device lockForConfiguration:&error]) {
        if (device.isFocusPointOfInterestSupported) {
            device.focusPointOfInterest = CGPointMake(0.5, 0.5);
        }
        if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
            device.focusMode = AVCaptureFocusModeContinuousAutoFocus;
        }
        [device unlockForConfiguration];
    }

    // Add the metadata output. It must be added to the session *before*
    // metadataObjectTypes is set, otherwise availableMetadataObjectTypes
    // is empty and setting the types throws an exception.
    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [self.session addOutput:output];

    NSLog(@"%lu", (unsigned long)output.availableMetadataObjectTypes.count);
    for (NSString *type in output.availableMetadataObjectTypes) {
        NSLog(@"%@", type);
    }

    // Check availableMetadataObjectTypes before assigning these; requesting
    // an unsupported type raises an exception.
    output.metadataObjectTypes = @[AVMetadataObjectTypeEAN13Code,
                                   AVMetadataObjectTypeEAN8Code,
                                   AVMetadataObjectTypeUPCECode];

    // rectOfInterest uses normalized coordinates (0.0–1.0), not pixels.
    output.rectOfInterest = CGRectMake(0, 0, 1, 1);

    [self.session startRunning];
}
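For context, the delegate wired up above receives decoded barcodes through the AVCaptureMetadataOutputObjectsDelegate callback. A minimal sketch of what that looks like (note that it delivers decoded metadata objects, not pixel frames):

```objectivec
// AVCaptureMetadataOutputObjectsDelegate callback: called on the queue passed
// to setMetadataObjectsDelegate:queue: whenever a code is recognized.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection {
    for (AVMetadataObject *object in metadataObjects) {
        if ([object isKindOfClass:[AVMetadataMachineReadableCodeObject class]]) {
            AVMetadataMachineReadableCodeObject *code =
                (AVMetadataMachineReadableCodeObject *)object;
            NSLog(@"Read %@: %@", code.type, code.stringValue);
        }
    }
}
```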
This code obviously doesn't render the frames to the screen yet. That's deliberate: instead of using an AVCaptureVideoPreviewLayer to display the preview, I need to get each frame as a UIImage (because I want to display the frame in multiple places on the view at once).
If I use AVCaptureVideoDataOutput as the output, I can export frames by grabbing them in the captureOutput:didOutputSampleBuffer:fromConnection: callback. But when AVCaptureMetadataOutput is the output, I can't find an equivalent way to get at the frame buffer.
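To be concrete, the AVCaptureVideoDataOutput route I'm describing looks roughly like this. This is a sketch only: it assumes the output's videoSettings request kCVPixelFormatType_32BGRA, and follows Apple's standard CMSampleBuffer-to-UIImage conversion pattern:

```objectivec
// AVCaptureVideoDataOutputSampleBufferDelegate callback: converts each BGRA
// sample buffer into a UIImage. Assumes videoSettings requested
// kCVPixelFormatType_32BGRA.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Wrap the BGRA pixel data in a bitmap context and snapshot it.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
        bytesPerRow, colorSpace,
        kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    UIImage *frame = [UIImage imageWithCGImage:cgImage];
    // ... hand `frame` to each of the views that should display it ...

    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
}
```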
Does anyone know how to do this?