
I want to extract frames from the live feed of an AVCaptureSession, and I am using Apple's AVCam as a test case. Here is the link to AVCam:

https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html

I found that captureOutput:didOutputSampleBuffer:fromConnection is not being called, and I would like to know why, or what I am doing wrong.

Here is what I did:

(1) I made AVCamViewController a delegate

@interface AVCamViewController () <AVCaptureFileOutputRecordingDelegate, AVCaptureVideoDataOutputSampleBufferDelegate>

(2) I created an AVCaptureVideoDataOutput object and added it to the session

AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
if ([session canAddOutput:videoDataOutput])
{
    [session addOutput:videoDataOutput];
}
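
For reference, I also set the sample buffer delegate and a queue on the output before adding it (this is the same wiring that appears in the full viewDidLoad below); roughly:

// Sketch of the delegate/queue wiring (same as in the viewDidLoad further down):
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
[videoDataOutput setSampleBufferDelegate:self queue:sessionQueue];
if ([session canAddOutput:videoDataOutput])
{
    [session addOutput:videoDataOutput];
}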

(3) I added the delegate method and tested it by logging a random string

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"I am called");

}
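
(Once the callback does fire, this is roughly how I plan to turn the sample buffer into a UIImage. This is just a sketch, assuming the output's videoSettings are set to kCVPixelFormatType_32BGRA:)

// Sketch only: convert a BGRA sample buffer to a UIImage.
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    void *baseAddress  = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width       = CVPixelBufferGetWidth(imageBuffer);
    size_t height      = CVPixelBufferGetHeight(imageBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:cgImage];

    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    return image;
}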

The test app works, but captureOutput:didOutputSampleBuffer:fromConnection is not called.

(4) I read on SO that having AVCaptureSession *session = [[AVCaptureSession alloc] init]; as a local variable in viewDidLoad is a possible reason the delegate is not called, so I made it an instance variable of the AVCamViewController class, but it is still not called.
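
(A sketch of that change — session is now an ivar of the class rather than a local variable in viewDidLoad, so it cannot be released when viewDidLoad returns:)

@interface AVCamViewController () <AVCaptureFileOutputRecordingDelegate, AVCaptureVideoDataOutputSampleBufferDelegate>
{
    AVCaptureSession *session;   // previously a local variable inside viewDidLoad
}
@end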

Here is the viewDidLoad method I am testing with (taken from AVCam); I added the AVCaptureVideoDataOutput at the end of the method:

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Create the AVCaptureSession
    session = [[AVCaptureSession alloc] init];
    [self setSession:session];

    // Setup the preview view
    [[self previewView] setSession:session];

    // Check for device authorization
    [self checkDeviceAuthorizationStatus];

    // In general it is not safe to mutate an AVCaptureSession or any of its inputs, outputs, or connections from multiple threads at the same time.
    // Why not do all of this on the main queue?
    // -[AVCaptureSession startRunning] is a blocking call which can take a long time. We dispatch session setup to the sessionQueue so that the main queue isn't blocked (which keeps the UI responsive).

    dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
    [self setSessionQueue:sessionQueue];

    dispatch_async(sessionQueue, ^{
        [self setBackgroundRecordingID:UIBackgroundTaskInvalid];

        NSError *error = nil;

        AVCaptureDevice *videoDevice = [AVCamViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
        AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

        if (error)
        {
            NSLog(@"%@", error);
        }

        if ([session canAddInput:videoDeviceInput])
        {
            [session addInput:videoDeviceInput];
            [self setVideoDeviceInput:videoDeviceInput];

            dispatch_async(dispatch_get_main_queue(), ^{
                // Why are we dispatching this to the main queue?
                // Because AVCaptureVideoPreviewLayer is the backing layer for AVCamPreviewView and UIView can only be manipulated on main thread.
                // Note: As an exception to the above rule, it is not necessary to serialize video orientation changes on the AVCaptureVideoPreviewLayer’s connection with other session manipulation.

                [[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)[self interfaceOrientation]];
            });
        }

        AVCaptureDevice *audioDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
        AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];

        if (error)
        {
            NSLog(@"%@", error);
        }

        if ([session canAddInput:audioDeviceInput])
        {
            [session addInput:audioDeviceInput];
        }

        AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
        if ([session canAddOutput:movieFileOutput])
        {
            [session addOutput:movieFileOutput];
            AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
            if ([connection isVideoStabilizationSupported])
                [connection setEnablesVideoStabilizationWhenAvailable:YES];
            [self setMovieFileOutput:movieFileOutput];
        }

        AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
        if ([session canAddOutput:stillImageOutput])
        {
            [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
            [session addOutput:stillImageOutput];
            [self setStillImageOutput:stillImageOutput];
        }

        AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
        [videoDataOutput setSampleBufferDelegate:self queue:sessionQueue];

        if ([session canAddOutput:videoDataOutput])
        {
            NSLog(@"Yes I can add it");
            [session addOutput:videoDataOutput];
        }

    });
}

- (void)viewWillAppear:(BOOL)animated
{
    dispatch_async([self sessionQueue], ^{
        [self addObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:SessionRunningAndDeviceAuthorizedContext];
        [self addObserver:self forKeyPath:@"stillImageOutput.capturingStillImage" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:CapturingStillImageContext];
        [self addObserver:self forKeyPath:@"movieFileOutput.recording" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:RecordingContext];
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];

        __weak AVCamViewController *weakSelf = self;
        [self setRuntimeErrorHandlingObserver:[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionRuntimeErrorNotification object:[self session] queue:nil usingBlock:^(NSNotification *note) {
            AVCamViewController *strongSelf = weakSelf;
            dispatch_async([strongSelf sessionQueue], ^{
                // Manually restarting the session since it must have been stopped due to an error.
                [[strongSelf session] startRunning];
                [[strongSelf recordButton] setTitle:NSLocalizedString(@"Record", @"Recording button record title") forState:UIControlStateNormal];
            });
        }]];
        [[self session] startRunning];
    });
}

Can someone tell me why this happens, and suggest how to fix it?


3 Answers


I have experimented with this a lot, and I think I may have the answer. I had similar but different code, written from scratch rather than copied from Apple's sample (which is getting a bit old now).

I believe the culprit is this section...

AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:movieFileOutput])
{
    [session addOutput:movieFileOutput];
    AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([connection isVideoStabilizationSupported])
        [connection setEnablesVideoStabilizationWhenAvailable:YES];
    [self setMovieFileOutput:movieFileOutput];
}

Based on my experimentation, this is what is causing your problem. With this code in place, my captureOutput:didOutputSampleBuffer:fromConnection was not called either. I believe the video system either gives you a series of sample buffers, or records a compressed, optimized movie file to disk, but not both. (At least on iOS.) I suppose that makes sense / is not surprising, but I have not seen it documented anywhere!
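
In other words, if you want the sample buffer callbacks, try leaving AVCaptureMovieFileOutput out of the session entirely. A minimal sketch of what I mean, using the names from your viewDidLoad (the wantsLiveFrames flag is purely for illustration):

BOOL wantsLiveFrames = YES;   // hypothetical flag, only to illustrate the either/or choice

if (wantsLiveFrames)
{
    // Live frames: add ONLY the video data output.
    AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [videoDataOutput setSampleBufferDelegate:self queue:sessionQueue];
    if ([session canAddOutput:videoDataOutput])
    {
        [session addOutput:videoDataOutput];
    }
}
else
{
    // Movie recording: add ONLY the movie file output.
    AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    if ([session canAddOutput:movieFileOutput])
    {
        [session addOutput:movieFileOutput];
        [self setMovieFileOutput:movieFileOutput];
    }
}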

Also, at one point I seemed to be getting errors and/or the buffer callback not firing when I had the microphone turned on. Again undocumented; the errors were -11800 (unknown error). But I could not always reproduce that.
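
If you want to check whether you are hitting the same thing, you can log the underlying NSError that arrives with the session's runtime error notification; a small sketch:

// Sketch: log the NSError carried by AVCaptureSessionRuntimeErrorNotification
// (this is where codes like -11800 show up).
[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionRuntimeErrorNotification
                                                  object:session
                                                   queue:nil
                                              usingBlock:^(NSNotification *note) {
    NSError *error = note.userInfo[AVCaptureSessionErrorKey];
    NSLog(@"Capture session runtime error: %@", error);
}];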

Answered 2015-07-24T16:06:33.453

Your code looks fine to me, and I can think of 10 guess-and-check things you could try, so I will take a different approach that will hopefully solve the problem indirectly. Besides the fact that I think AVCam is poorly written, you are better off looking at an example that focuses only on live video rather than on recording video and capturing still images. I have provided an example below that does just that and nothing more.

-(void)startSession {
    self.session = [AVCaptureSession new];
    self.session.sessionPreset = AVCaptureSessionPresetMedium;
    AVCaptureDevice *backCamera;
    for (AVCaptureDevice *device in [AVCaptureDevice devices]) {
        if ([device hasMediaType:AVMediaTypeVideo] && device.position == AVCaptureDevicePositionBack) {
            backCamera = device;
            break;
        }
    }
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
    if (error) {
        // handle error
    }
    if ([self.session canAddInput:input]) {
        [self.session addInput:input];
    }
    AVCaptureVideoDataOutput *output = [AVCaptureVideoDataOutput new];
    [output setSampleBufferDelegate:self queue:self.queue];
    output.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey:@(kCVPixelFormatType_32BGRA)};
    if ([self.session canAddOutput:output]) {
        [self.session addOutput:output];
    }
    dispatch_async(self.queue, ^{
        [self.session startRunning];
    });
}
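
For completeness, here is a sketch of the pieces the snippet above assumes but does not show — the queue property and the delegate callback itself (the queue label and the logging are just placeholders):

// Assumed elsewhere in the class (sketch):
@property (nonatomic, strong) dispatch_queue_t queue;

// e.g. before calling startSession:
self.queue = dispatch_queue_create("camera.frame.queue", DISPATCH_QUEUE_SERIAL);

// Called on self.queue for every captured frame:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    NSLog(@"Got a %zu x %zu frame",
          CVPixelBufferGetWidth(pixelBuffer),
          CVPixelBufferGetHeight(pixelBuffer));
}
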
Answered 2014-08-03T23:38:42.520

I ran into the same problem while building a bridge between React Native and native iOS/Swift/Objective-C.

Then I found 2 similar questions. @Carl's answer does indeed appear to be correct. And then I found the answer to the other question:

I contacted an engineer at Apple support, and he told me that simultaneous use of AVCaptureVideoDataOutput + AVCaptureMovieFileOutput is not supported. I do not know whether they will support it in the future, but he used the words "not supported at this time".

I encourage you to file a bug report / feature request as I did (bugreport.apple.com), since they measure how badly people want something, and we may then see it in the near future.

Simultaneous AVCaptureVideoDataOutput and AVCaptureMovieFileOutput

Answered 2018-05-21T17:03:48.137