
I am trying to get `CMSampleBufferRef`s from both an `AVCaptureVideoDataOutput` and an `AVCaptureAudioDataOutput`.

AVCamRecorder.h

#import <AVFoundation/AVFoundation.h>

@interface AVCamRecorder : NSObject

@property (nonatomic, retain) AVCaptureSession *session;
@property (nonatomic, retain) AVCaptureVideoDataOutput *videoDataOutput;
@property (nonatomic, retain) AVCaptureAudioDataOutput *audioDataOutput;

@end

AVCamRecorder.m

#import "AVCamRecorder.h"
#import <AVFoundation/AVFoundation.h>

@interface AVCamRecorder (VideoDataOutputDelegate) <AVCaptureVideoDataOutputSampleBufferDelegate>
@end
@interface AVCamRecorder (AudioDataOutputDelegate) <AVCaptureAudioDataOutputSampleBufferDelegate>
@end


@implementation AVCamRecorder

- (id)initWithSession:(AVCaptureSession *)aSession
{
    self = [super init];
    if (self != nil) {

        // Audio data output
        AVCaptureAudioDataOutput *aAudioDataOutput = [[AVCaptureAudioDataOutput alloc] init];

        // Video data output
        AVCaptureVideoDataOutput *aMovieDataOutput = [[AVCaptureVideoDataOutput alloc] init];

        if ([aSession canAddOutput:aAudioDataOutput]) {
            [aSession addOutput:aAudioDataOutput];
        }
        if ([aSession canAddOutput:aMovieDataOutput]) {
            [aSession addOutput:aMovieDataOutput];
        }

        [aAudioDataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
        [aMovieDataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

        [self setAudioDataOutput:aAudioDataOutput];
        [self setVideoDataOutput:aMovieDataOutput];

        // The retain properties above hold on to the outputs (MRC)
        [aAudioDataOutput release];
        [aMovieDataOutput release];

        [self setSession:aSession];
    }
    return self;
}

@end

@implementation AVCamRecorder (VideoDataOutputDelegate)
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"VideoDataOutputDelegate = %@", captureOutput);
}
@end

@implementation AVCamRecorder (AudioDataOutputDelegate)
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"AudioDataOutputDelegate = %@", captureOutput);
}
@end

Oddly, I get video data in `@implementation AVCamRecorder (AudioDataOutputDelegate)`:

AudioDataOutputDelegate = <AVCaptureVideoDataOutput: 0x208a7df0>

I swapped the order of `@implementation AVCamRecorder (VideoDataOutputDelegate)` and `@implementation AVCamRecorder (AudioDataOutputDelegate)`, and got

VideoDataOutputDelegate = <AVCaptureVideoDataOutput: 0x208a7df0>

It seems I cannot set up two `captureOutput:didOutputSampleBuffer:fromConnection:` methods; the data only ever arrives in one of them.

Or, have I set up `@implementation AVCamRecorder (VideoDataOutputDelegate)` and `@implementation AVCamRecorder (AudioDataOutputDelegate)` incorrectly?

I suppose I don't actually need separate callbacks, but I would like to understand what is going wrong.

Thanks in advance for your help.


1 Answer


You defined two categories on the same class,

AVCamRecorder (VideoDataOutputDelegate)
AVCamRecorder (AudioDataOutputDelegate)

which declare the very same method:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection;

This causes undefined behavior. See "Avoid Category Method Name Clashes" in the *Programming with Objective-C* guide:

If the name of a method declared in a category is the same as a method in the original class, or a method in another category on the same class (or even a superclass), the behavior is undefined as to which method implementation is used at runtime.
...

So your setup cannot work. Instead, you could

  • define two separate classes, one acting as the audio delegate and one as the video delegate,
  • define a single class (or category) as the combined audio + video delegate, and check in the callback which capture output invoked it, or
  • simply use AVCamRecorder itself as the audio + video delegate.
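The last option can be sketched as follows, assuming the `videoDataOutput` and `audioDataOutput` properties from the question: one `captureOutput:didOutputSampleBuffer:fromConnection:` implementation in the main `@implementation AVCamRecorder` block (with AVCamRecorder declared to conform to both delegate protocols), dispatching on the identity of the capture output.

    // In AVCamRecorder.h, conform to both protocols:
    // @interface AVCamRecorder : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate,
    //                                      AVCaptureAudioDataOutputSampleBufferDelegate>

    // In AVCamRecorder.m, a single shared callback:
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        if (captureOutput == self.videoDataOutput) {
            // Video sample buffer: process pixel data here
            NSLog(@"video sample buffer = %@", sampleBuffer);
        } else if (captureOutput == self.audioDataOutput) {
            // Audio sample buffer: process audio data here
            NSLog(@"audio sample buffer = %@", sampleBuffer);
        }
    }

Because both outputs deliver to the same delegate method, comparing `captureOutput` against the stored output objects is enough to route video and audio buffers to different code paths; no categories are needed.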
Answered 2012-12-31T17:52:55.583