
I am using AVFoundation to access images and audio in order to make a video. The problem starts when I add the device for audio:

AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *microphone_input = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];
AVCaptureAudioDataOutput *audio_output = [[AVCaptureAudioDataOutput alloc] init];
[self.captureSession2 addInput:microphone_input];
[self.captureSession2 addOutput:audio_output];
dispatch_queue_t queue2;
queue2 = dispatch_queue_create("Audio", NULL);
[audio_output setSampleBufferDelegate:self queue:queue2];
dispatch_release(queue2);

And this for the images from the camera:

AVCaptureDevice *cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

// Put the camera device on the input.
AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:cameraDevice error:nil];

// Select the output.
AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];

[self.captureSession addInput:captureInput];
[self.captureSession addOutput:captureOutput];
dispatch_queue_t    queue;
queue = dispatch_queue_create("cameraQueue", 0);
[captureOutput setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);

After all that, I get the raw data through the delegate:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if ([captureOutput isKindOfClass:[AVCaptureAudioDataOutput class]])
        [self sendAudeoRaw:sampleBuffer];
    if ([captureOutput isKindOfClass:[AVCaptureVideoDataOutput class]])
        [self sendVideoRaw:sampleBuffer];
}

Getting the raw image data is very slow, about 2 images per second. How can I improve this? I am looking for about 10-12 images per second. Please help.


1 Answer


Do these four things to start:

Create a global queue, and don't release it until you deallocate the encapsulating object; specify 'serial' as the type of queue, and make the target the main queue:

_captureOutputQueue = dispatch_queue_create_with_target("bush.alan.james.PhotosRemote.captureOutputQueue", DISPATCH_QUEUE_SERIAL, dispatch_get_main_queue());

Get the media type description from each sample buffer to determine whether the sample buffer contains audio or video data:

CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
CMMediaType mediaType = CMFormatDescriptionGetMediaType(formatDescription);
if (mediaType == kCMMediaType_Audio)...
if (mediaType == kCMMediaType_Video)...
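
Put together, the single delegate callback might look like this (a minimal sketch; processAudioSampleBuffer: and processVideoSampleBuffer: are hypothetical methods standing in for your own handling):

// Assumes <AVFoundation/AVFoundation.h> and <CoreMedia/CoreMedia.h> are imported.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
    CMMediaType mediaType = CMFormatDescriptionGetMediaType(formatDescription);

    if (mediaType == kCMMediaType_Audio) {
        [self processAudioSampleBuffer:sampleBuffer]; // hypothetical audio handler
    } else if (mediaType == kCMMediaType_Video) {
        [self processVideoSampleBuffer:sampleBuffer]; // hypothetical video handler
    }
}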

Instead of sending the sample buffers to another class by a method call, make the other class the data output delegate; you're doubling the work, otherwise.
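
A minimal sketch of that wiring, assuming a hypothetical FrameProcessor class that owns the processing logic:

// FrameProcessor.h -- hypothetical class that does the actual work on the buffers.
@interface FrameProcessor : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate,
                                      AVCaptureAudioDataOutputSampleBufferDelegate>
@end

// During session setup, hand the outputs straight to the processor;
// keep a strong reference (e.g., a property) so it outlives the session.
self.frameProcessor = [[FrameProcessor alloc] init];
[captureOutput setSampleBufferDelegate:self.frameProcessor queue:_captureOutputQueue];
[audio_output setSampleBufferDelegate:self.frameProcessor queue:_captureOutputQueue];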

Finally, make sure you're running the AVSession in a queue of its own. Per Apple's documentation for AVCaptureSession:

The startRunning method is a blocking call which can take some time, therefore you should perform session setup on a serial queue so that the main queue isn't blocked (which keeps the UI responsive). See AVCam-iOS: Using AVFoundation to Capture Images and Movies for an implementation example.

That includes any calls made to methods that configure the camera and, in particular, any that call the startRunning or stopRunning methods of AVCaptureSession:

dispatch_async(self.sessionQueue, ^{
    [self configureSession];
});

dispatch_async(self.sessionQueue, ^{
    [self.session startRunning];
});

dispatch_async(self.sessionQueue, ^{
    [self.session stopRunning];
});
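
Here, self.sessionQueue is assumed to be a serial queue created once and kept for the lifetime of the object, for example:

// Created once (e.g., in -viewDidLoad or an initializer) and stored in a property.
self.sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);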

If you cannot set the delegate as the class that processes the sample buffers, you may consider putting them on a queue to which both classes have access, and then passing a key:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    static char kMyKey; // set the key to any value; pass the key, not the sample buffer, to the receiver
    dispatch_queue_set_specific(((AppDelegate *)[[UIApplication sharedApplication] delegate]).serialQueue,
                                &kMyKey,
                                (void *)CFRetain(sampleBuffer),
                                (dispatch_function_t)CFRelease);
}

In the receiver class:

dispatch_async(((AppDelegate *)[[UIApplication sharedApplication] delegate]).serialQueue, ^{
    CMSampleBufferRef sb = (CMSampleBufferRef)dispatch_get_specific(&kMyKey);
    NSLog(@"sb: %d", CMSampleBufferIsValid(sb));
});
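
Note that dispatch_get_specific reads the value for the key from the queue the calling block is running on, so the receiving block must execute on that same serialQueue. Both classes also need to reference the same key address, so kMyKey would have to be declared somewhere both can see it (for example, in a shared header) rather than static inside one method.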
answered 2018-01-21 05:10