
I got my AVCaptureSession working, and it duplicates the Camera.app UI almost perfectly; however, after a few seconds the application crashes and I just cannot find what I am doing wrong. I really hope someone knows how to optimize this!

I am using ARC; and again, the whole session runs fine but crashes after a little while. The AVCaptureSession delegate method gets called every single second. If there is a way to call that method only when the user presses the "take picture" button, how would I do that while still keeping the "live" preview layer?

Thanks in advance!

Setting up the session:

NSError *error = nil;
session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
[session addInput:input];

output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];

// Deliver sample buffers to the delegate on a dedicated serial queue
dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);

output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
if(version >= 4.0 && version < 5.0) {
    // Cap frame delivery at 15 fps (minFrameDuration is deprecated as of iOS 5)
    output.minFrameDuration = CMTimeMake(1, 15);
}
output.alwaysDiscardsLateVideoFrames = YES;

previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer:previewLayer];
[self.view addSubview:camera_overlay];
[session startRunning];

The AVCaptureSession delegate method that gets called:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    UIImage *capture_image = [self imageFromSampleBuffer:sampleBuffer];
    return capture_image;
}

The method that gets a UIImage from the sample buffer:

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {

    // Lock the pixel buffer so its base address can be read safely
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Wrap the BGRA pixel data in a bitmap context and snapshot it as a CGImage
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // The UIImage is autoreleased; the CGImage backing it is released here
    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);

    return image;

}

2 Answers


Check out Apple's AVCam demo application for a complete example.

The method

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
fromConnection:(AVCaptureConnection *)connection {

is called every time a camera frame is ready. In your case it is called 15 times a second, or at least it should be, since you specified the frame rate as output.minFrameDuration = CMTimeMake(1, 15);
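As a side note, minFrameDuration on the output was deprecated in iOS 5; a minimal sketch of the replacement, assuming the same output already added to the session:

AVCaptureConnection *conn = [output connectionWithMediaType:AVMediaTypeVideo];
if (conn.supportsVideoMinFrameDuration) {
    conn.videoMinFrameDuration = CMTimeMake(1, 15); // deliver at most 15 fps
}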

From the code you provided, the only reason I can think of is that you are not releasing UIImage *capture_image.
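If that is the cause, one plausible fix under ARC (a minimal sketch, not part of the original answer) is to drain autoreleased objects once per frame by wrapping the delegate body in an @autoreleasepool, so the converted images cannot pile up on the capture queue:

// Sketch: free each frame's autoreleased UIImage before the next one arrives
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    @autoreleasepool {
        UIImage *capture_image = [self imageFromSampleBuffer:sampleBuffer];
        // use capture_image here; the method is void, so nothing is returned
    }
}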

You can profile your application with Xcode Instruments and find out why this happens: Instruments Guide

The Leaks instrument is your first stop; there are plenty of tutorials online. Here is one: Tracking iPhone Memory Leaks, written by SO user OwenGross and, if I remember correctly, taken from here.

Answered 2011-11-27T05:12:17.623

This post looks quite old, but in case anyone comes across it:

Who are you returning the image to in the delegate method (

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    UIImage *capture_image = [self imageFromSampleBuffer:sampleBuffer];
    return capture_image;
}

)?

You could use a button that raises a flag, check in the delegate method whether the flag is raised, and only then create the image, as in the sketch below. The image should be stored in an instance variable, otherwise it will be lost anyway.
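A minimal sketch of that flag approach (the takePictureRequested flag, capturedImage ivar, and takePicture: action are illustrative names, not from the original code):

// Assumed ivars: BOOL takePictureRequested; UIImage *capturedImage;

- (IBAction)takePicture:(id)sender {
    takePictureRequested = YES; // raise the flag when the button is tapped
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    if (!takePictureRequested) return; // preview keeps running via previewLayer
    takePictureRequested = NO;
    // Keep the image in an instance variable so it outlives this callback
    capturedImage = [self imageFromSampleBuffer:sampleBuffer];
}

Note that the flag is touched from both the main thread and the capture queue, so a real implementation should synchronize it (for example with an atomic property).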

There is also a dedicated API for capturing still images, captureStillImageAsynchronouslyFromConnection:completionHandler: on AVCaptureStillImageOutput.
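A minimal sketch of that route, assuming an AVCaptureStillImageOutput added to the same session (the stillImageOutput name is illustrative):

AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
stillImageOutput.outputSettings = [NSDictionary dictionaryWithObject:AVVideoCodecJPEG forKey:AVVideoCodecKey];
[session addOutput:stillImageOutput];

// Later, when the user taps the shutter button:
AVCaptureConnection *connection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
    completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer == NULL) return;
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *stillImage = [UIImage imageWithData:jpegData];
        // hand stillImage off to the main queue for display or saving
    }];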

Answered 2012-08-28T14:30:55.610