
I'm looking into displaying the front camera's video feed in a UIView, similar to FaceTime. I know this can easily be done with AVCaptureVideoPreviewLayer. Is there another way to do it without using AVCaptureVideoPreviewLayer?

This is purely for educational purposes.

Update: I found this can be done with UIImagePickerController:

// Embed the camera UI without showing the standard controls.
UIImagePickerController *cameraView = [[UIImagePickerController alloc] init];
cameraView.sourceType = UIImagePickerControllerSourceTypeCamera;
cameraView.cameraDevice = UIImagePickerControllerCameraDeviceFront; // front camera, as in the question
cameraView.showsCameraControls = NO;
[self.view addSubview:cameraView.view];
// Forwarding the appearance callbacks by hand works, but view controller
// containment (see the sketch below) is the cleaner approach on iOS 5+.
[cameraView viewWillAppear:YES];
[cameraView viewDidAppear:YES];
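As an aside (not part of the original update), on iOS 5 and later the same embedding is usually expressed through view controller containment rather than calling the appearance methods by hand. A minimal sketch, assuming this code runs inside a UIViewController subclass:

UIImagePickerController *cameraView = [[UIImagePickerController alloc] init];
cameraView.sourceType = UIImagePickerControllerSourceTypeCamera;
cameraView.cameraDevice = UIImagePickerControllerCameraDeviceFront;
cameraView.showsCameraControls = NO;

// Containment: UIKit forwards viewWillAppear:/viewDidAppear: automatically.
[self addChildViewController:cameraView];
cameraView.view.frame = self.view.bounds;
[self.view addSubview:cameraView.view];
[cameraView didMoveToParentViewController:self];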

1 Answer


If you are trying to manipulate the pixels, you can put the following method into the class you assign as the AVCaptureVideoDataOutputSampleBufferDelegate:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pb = CMSampleBufferGetImageBuffer(sampleBuffer);

    if (CVPixelBufferLockBaseAddress(pb, 0))  // zero is success
        NSLog(@"Error locking pixel buffer");

    size_t bufferHeight = CVPixelBufferGetHeight(pb);
    size_t bufferWidth  = CVPixelBufferGetWidth(pb);
    size_t bytesPerRow  = CVPixelBufferGetBytesPerRow(pb);

    unsigned char *rowBase = (unsigned char *)CVPixelBufferGetBaseAddress(pb);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    if (colorSpace == NULL)
        NSLog(@"Error creating color space");

    // Create a bitmap graphics context with the sample buffer data.
    // The bitmap info must match the pixel format configured on the
    // AVCaptureVideoDataOutput: for the common kCVPixelFormatType_32BGRA output,
    // use kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst instead of
    // kCGImageAlphaNone, which an RGB color space will reject.
    CGContextRef context = CGBitmapContextCreate(rowBase, bufferWidth, bufferHeight, 8, bytesPerRow, colorSpace, kCGImageAlphaNone);

    // Create a Quartz image from the pixel data in the bitmap graphics context.
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    UIImage *currentImage = [UIImage imageWithCGImage:quartzImage];

    // Free up the image, context and color space.
    CGImageRelease(quartzImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    if (CVPixelBufferUnlockBaseAddress(pb, 0))  // zero is success
        NSLog(@"Error unlocking pixel buffer");

    // currentImage is now ready to be handed to a UIImageView on the main thread (see below).
}

Then hook that image up to a UIImageView in your view controller. Take a look at the kCGImageAlphaNone flag; the right bitmap flags will depend on what you are doing.
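For context, this delegate method is only called once a running AVCaptureSession is feeding an AVCaptureVideoDataOutput whose sample buffer delegate is set to your class. The original answer does not show that setup; the following is a rough sketch of it, where the session property, the "videoQueue" label, and the choice of kCVPixelFormatType_32BGRA (which pairs with kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst in the bitmap context above) are assumptions for illustration:

#import <AVFoundation/AVFoundation.h>

- (void)startCapture
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Assumption: use the front camera, as in the question.
    AVCaptureDevice *frontCamera = nil;
    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (device.position == AVCaptureDevicePositionFront)
            frontCamera = device;
    }

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
    if (input)
        [session addInput:input];

    // Deliver BGRA frames to the captureOutput:didOutputSampleBuffer:fromConnection: method above.
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    [output setSampleBufferDelegate:self queue:dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL)];
    [session addOutput:output];

    [session startRunning];
    self.session = session; // assumed strong property; keeps the session alive
}

Inside the delegate you would then push each frame to the main thread, for example dispatch_async(dispatch_get_main_queue(), ^{ self.imageView.image = currentImage; });, where imageView is an assumed UIImageView outlet in your view controller.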

Answered 2013-02-15T20:57:06.917