
My goal is to mirror an iDevice's screen to OSX with as little latency as possible.

As far as I know, there are two ways to do this:

  1. AirPlay Mirroring (e.g. Reflector)
  2. CoreMediaIO via Lightning (e.g. QuickTime recording)

I went with the second approach, because (as far as I know) a connected iDevice is recognized automatically as a DAL device after a one-time setup.

The main resource on how to do this is this blog post: https://nadavrub.wordpress.com/2015/07/06/macos-media-capture-using-coremediaio/

That blog covers the use of CoreMediaIO in great depth; however, once the connected iDevice is recognized as an AVCaptureDevice, you can work with plain AVFoundation.

This question (How can I mirror the iOS screen via USB?) posted a solution for grabbing each frame of the H264 (Annex B) muxed data stream that the iDevice delivers.
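For reference, each NAL unit in such an Annex B stream is introduced by a `00 00 00 01` start code, and the low five bits of the byte that follows give the NAL unit type (7 = SPS, 8 = PPS, 5 = IDR slice). A minimal sketch of inspecting a unit (function name is mine, and it assumes 4-byte start codes):

```c
#include <stddef.h>
#include <stdint.h>

/* Return the H.264 NAL unit type of the unit beginning at `p`,
 * where `p` points at a 4-byte Annex B start code (00 00 00 01),
 * or -1 if the buffer does not start with such a start code. */
int annexb_nal_type(const uint8_t *p, size_t len) {
    if (len < 5 || p[0] != 0 || p[1] != 0 || p[2] != 0 || p[3] != 1)
        return -1;
    return p[4] & 0x1F;  /* low 5 bits: 7 = SPS, 8 = PPS, 5 = IDR */
}
```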

However, my problem is that VideoToolbox fails to decode these frames (error code -8969, BadData), even though there shouldn't be any difference in the code.

vtDecompressionDuctDecodeSingleFrame signalled err=-8969 (err) (VTVideoDecoderDecodeFrame returned error) at /SourceCache/CoreMedia_frameworks/CoreMedia-1562.240/Sources/VideoToolbox/VTDecompressionSession.c line 3241

Full code:

#import "ViewController.h"

@import CoreMediaIO;
@import AVFoundation;
@import AppKit;

@implementation ViewController

AVCaptureSession *session;
AVCaptureDeviceInput *newVideoDeviceInput;
AVCaptureVideoDataOutput *videoDataOutput;

- (void)viewDidLoad {
    [super viewDidLoad];
}

- (instancetype)initWithCoder:(NSCoder *)coder
{
    self = [super initWithCoder:coder];
    if (self) {
        // Allow iOS Devices Discovery
        CMIOObjectPropertyAddress prop =
        { kCMIOHardwarePropertyAllowScreenCaptureDevices,
            kCMIOObjectPropertyScopeGlobal,
            kCMIOObjectPropertyElementMaster };
        UInt32 allow = 1;
        CMIOObjectSetPropertyData( kCMIOObjectSystemObject,
                                  &prop, 0, NULL,
                                  sizeof(allow), &allow );

        // Get devices
        NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeMuxed];
        BOOL deviceAttached = false;
        for (int i = 0; i < [devices count]; i++) {
            AVCaptureDevice *device = devices[i];
            if ([[device uniqueID] isEqualToString:@"b48defcadf92f300baf5821923f7b3e2e9fb3947"]) {
                deviceAttached = true;
                [self startSession:device];
                break;
            }
        }

    }
    return self;
}

- (void) deviceConnected:(AVCaptureDevice *)device {
    if ([[device uniqueID] isEqualToString:@"b48defcadf92f300baf5821923f7b3e2e9fb3947"]) {
        [self startSession:device];
    }
}

- (void) startSession:(AVCaptureDevice *)device {

    // Init capturing session
    session = [[AVCaptureSession alloc] init];

    // Start session configuration
    [session beginConfiguration];

    // Add session input
    NSError *error;
    newVideoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (newVideoDeviceInput == nil) {
        dispatch_async(dispatch_get_main_queue(), ^(void) {
            NSLog(@"%@", error);
        });
    } else {
        [session addInput:newVideoDeviceInput];
    }

    // Add session output
    videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    videoDataOutput.videoSettings = [NSDictionary dictionaryWithObject: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey: (id)kCVPixelBufferPixelFormatTypeKey];

    dispatch_queue_t videoQueue = dispatch_queue_create("videoQueue", NULL);

    [videoDataOutput setSampleBufferDelegate:self queue:videoQueue];
    [session addOutput:videoDataOutput];

    // Finish session configuration
    [session commitConfiguration];

    // Start the session
    [session startRunning];
}

#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    //NSImage *resultNSImage = [self imageFromSampleBuffer:sampleBuffer];

    //self.imageView.image = [self nsImageFromSampleBuffer:sampleBuffer];
    self.imageView.image = [[NSImage alloc] initWithData:imageToBuffer(sampleBuffer)];
}    

NSData* imageToBuffer( CMSampleBufferRef source) {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(source);
    CVPixelBufferLockBaseAddress(imageBuffer,0);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    void *src_buff = CVPixelBufferGetBaseAddress(imageBuffer);

    NSData *data = [NSData dataWithBytes:src_buff length:bytesPerRow * height];

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    return data;
}  

1 Answer


Nope, you have to remove the Annex B start codes and replace them with size values. Same as the MP4 format.
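A minimal sketch of that conversion in plain C (the function name is mine, and it assumes every start code is the 4-byte `00 00 00 01` form, so the output is the same size as the input — the size prefix written is 4-byte big-endian, as AVCC/MP4 expects):

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Rewrite an Annex B buffer into AVCC form: each 4-byte start code
 * (00 00 00 01) is replaced by the 4-byte big-endian length of the
 * NAL unit that follows it. `out` must be at least `len` bytes.
 * Returns the number of bytes written, or 0 if `in` does not begin
 * with a 4-byte start code. */
size_t annexb_to_avcc(const uint8_t *in, size_t len, uint8_t *out) {
    size_t i = 0, o = 0;
    while (i + 4 <= len) {
        if (!(in[i] == 0 && in[i+1] == 0 && in[i+2] == 0 && in[i+3] == 1))
            return 0; /* not aligned on a 4-byte start code */
        size_t nal_start = i + 4, nal_end = nal_start;
        /* scan forward to the next start code (or end of buffer) */
        while (nal_end + 4 <= len &&
               !(in[nal_end] == 0 && in[nal_end+1] == 0 &&
                 in[nal_end+2] == 0 && in[nal_end+3] == 1))
            nal_end++;
        if (nal_end + 4 > len)
            nal_end = len;
        uint32_t nal_len = (uint32_t)(nal_end - nal_start);
        /* write the big-endian length prefix, then the NAL payload */
        out[o++] = (uint8_t)(nal_len >> 24);
        out[o++] = (uint8_t)(nal_len >> 16);
        out[o++] = (uint8_t)(nal_len >> 8);
        out[o++] = (uint8_t)(nal_len);
        memcpy(out + o, in + nal_start, nal_len);
        o += nal_len;
        i = nal_end;
    }
    return o;
}
```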

answered Aug 9, 2016 at 17:24