
I am learning about AVCaptureSession and how to capture multiple images using its delegate method

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
     didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
     fromConnection:(AVCaptureConnection *)connection

My goal is to capture one or more images per second at a predefined rate, for example 1 or 2 images every second. So I set up

AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
captureOutput.alwaysDiscardsLateVideoFrames = YES;
captureOutput.minFrameDuration = CMTimeMake(1, 1); // intent: at least 1 second between frames (~1 fps)

When I start the session with [self.captureSession startRunning];, my log shows that the delegate is called 20 times per second. Where does that rate come from, and how can I capture images at the interval I intended?
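For context, here is a minimal sketch of the kind of session setup being described, assuming a self.captureSession property, the default video device, and a dedicated serial dispatch queue for the delegate (these names are illustrative, not taken from the question):

#import <AVFoundation/AVFoundation.h>

- (void)setupCaptureSession
{
    self.captureSession = [[AVCaptureSession alloc] init];

    // Default camera as input (error handling kept minimal for the sketch).
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input)
    {
        [self.captureSession addInput:input];
    }

    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.alwaysDiscardsLateVideoFrames = YES;
    captureOutput.minFrameDuration = CMTimeMake(1, 1); // intent: ~1 frame per second

    // Sample buffers are delivered to the delegate on this serial queue.
    dispatch_queue_t queue = dispatch_queue_create("videoDataOutputQueue", NULL);
    [captureOutput setSampleBufferDelegate:self queue:queue];
    [self.captureSession addOutput:captureOutput];

    [self.captureSession startRunning];
}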


2 Answers


You can use the function given below. If you want to capture at a specific interval, set up a timer and call the function from it again (see the timer sketch after the function).

- (IBAction)captureNow
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [stillImageOutput connections])
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection)
        {
            break;
        }
    }

    NSLog(@"About to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments)
        {
            // Do something with the attachments.
            NSLog(@"Attachments: %@", exifAttachments);
        }
        else
        {
            NSLog(@"No attachments found.");
        }

        // Convert the still image buffer to JPEG data and show it in the image view.
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        [[self vImage] setImage:image];
    }];
}
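As the answer says, to capture at a fixed interval you can drive captureNow from a timer. A minimal sketch using NSTimer, assuming a 1-second interval and a captureTimer property on the same object (both are assumptions, not from the answer):

- (void)startCaptureTimer
{
    // self.captureTimer is an assumed NSTimer property used to keep the timer around.
    self.captureTimer = [NSTimer scheduledTimerWithTimeInterval:1.0
                                                         target:self
                                                       selector:@selector(captureTimerFired:)
                                                       userInfo:nil
                                                        repeats:YES];
}

- (void)captureTimerFired:(NSTimer *)timer
{
    [self captureNow];
}

- (void)stopCaptureTimer
{
    [self.captureTimer invalidate];
    self.captureTimer = nil;
}

Invalidate the timer (and stop the session) when the view goes away; a repeating NSTimer retains its target until it is invalidated.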

For further reference, have a look at iOS4: Take photos with live video preview using AVFoundation.

answered 2011-11-25T05:28:29.283

Something I struggled with for a while was a huge delay (about 5 seconds) when taking a picture and trying to set a UIImage from the captured frame. Inside the

 - (void)captureOutput:(AVCaptureOutput *)captureOutput 
 didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
 fromConnection:(AVCaptureConnection *)connection

method, you cannot use normal UI calls such as [self.image setImage:img]; anything that touches the UI has to be run on the main thread, like this:

 [self.image performSelectorOnMainThread:@selector(setImage:) withObject:img waitUntilDone:TRUE];
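An equivalent way to get onto the main thread is GCD. A sketch of the delegate method, assuming a hypothetical imageFromSampleBuffer: helper (such as the one in Apple's AVFoundation sample code) that turns the CMSampleBufferRef into a UIImage:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Convert the sample buffer to a UIImage here, still on the capture queue.
    UIImage *img = [self imageFromSampleBuffer:sampleBuffer]; // hypothetical helper

    // UIKit must only be touched on the main thread.
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.image setImage:img];
    });
}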

Hope this helps someone.

answered 2013-01-09T10:15:01.313