
I'm building a native extension for iOS in which I want to implement a barcode scanner.

I followed the AVCam sample and tried it in a native app (a full Xcode project), where it works fine.

Now I want to use this code in a Flex Mobile project. I've been able to create the ANE, add it to the Flex Mobile project, and call the ANE's functions.

It seems to run, but my problem is that I can't see what the camera sees. I have a method I call to start the camera and initialize the capture, and I've also implemented the captureOutput delegate. The strangest part is that when I run the app I can see the logs from initCapture and captureOutput, as if the app were capturing data, yet on the iPad I can't see the camera preview.

Here is part of the code I'm using:

- (void)initCapture
{
    NSLog(@"camera view capture init");
    /* We set up the input */
    self.device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil];
    /* We set up the output */
    captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    // If the queue is blocked when new frames are captured, those frames will be automatically dropped
    captureOutput.alwaysDiscardsLateVideoFrames = YES;
    //captureOutput.minFrameDuration = CMTimeMake(1, 10); Uncomment it to specify a minimum duration for each video frame
    [captureOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
    // Set the pixel format the video output delivers frames in
    NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;

    //************************Note this line
    NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange];

    NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];

    //And we create a capture session
    self.captureSession = [[AVCaptureSession alloc] init];
    //We add input and output
    [self.captureSession addInput:captureInput];
    [self.captureSession addOutput:captureOutput];


    if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720])
    {
        NSLog(@"camera view Set preview port to 1280X720");
        self.captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
    } else
        //set to 640x480 if 1280x720 not supported on device
        if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset640x480])
        {
            NSLog(@"camera view Set preview port to 640X480");
            self.captureSession.sessionPreset = AVCaptureSessionPreset640x480;
        }


    /*We add the preview layer*/

    self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession: self.captureSession];

    if ([self.prevLayer respondsToSelector:@selector(connection)])
        self.prevLayer.connection.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
    else
        self.prevLayer.orientation = AVCaptureVideoOrientationLandscapeLeft;

    self.prevLayer.frame = CGRectMake(150, 0, 700, 700);
    self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    [self.view.layer addSublayer: self.prevLayer];
}

- (void) startScanning {
    NSLog(@"camera view start scanning");
    self.state = LAUNCHING_CAMERA;
    [self.captureSession startRunning];
    self.prevLayer.hidden = NO;
    self.state = CAMERA;
}

#pragma mark AVCaptureSession delegate

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"camera view Capture output");
}
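The delegate is just a stub for now. Once the preview works, the rough idea (not wired up yet, and the actual barcode decoding is omitted) is to read the luma plane of each frame and hand it to the decoder, something like this:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // With kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, plane 0 is the
    // luma (Y) plane, which is what most barcode decoders work on
    uint8_t *luma   = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
    size_t width    = CVPixelBufferGetWidthOfPlane(imageBuffer, 0);
    size_t height   = CVPixelBufferGetHeightOfPlane(imageBuffer, 0);
    size_t rowBytes = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0);

    NSLog(@"camera view frame %zux%zu, %zu bytes per row, luma plane at %p",
          width, height, rowBytes, luma);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
}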

How can I fix this?

Thank you very much.


1 Answer


I think I've solved it.

Instead of:

[self.view.layer addSublayer: self.prevLayer];

I used:

UIViewController *mainController = [UIApplication sharedApplication].keyWindow.rootViewController;
[mainController.view.layer addSublayer: self.prevLayer];

Now I can see the camera preview in my Flex app.
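Putting it together, the tail of initCapture then looks roughly like this (a sketch only; it assumes the AIR runtime has already set up the key window and root view controller by the time initCapture runs):

/* Add the preview layer to the Flex app's root view controller,
   since the ANE's own view is never attached to the window hierarchy */
self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
self.prevLayer.frame = CGRectMake(150, 0, 700, 700);
self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspect;

UIViewController *mainController = [UIApplication sharedApplication].keyWindow.rootViewController;
[mainController.view.layer addSublayer:self.prevLayer];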

Answered 2013-02-13T12:15:42.050