
I am using the captureOutput:didOutputSampleBuffer:fromConnection: delegate method of AVCaptureVideoDataOutput. When testing on an iPad, the image buffer size is always 360x480, which seems really strange; I would have expected it to match the size of the iPad screen.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection { 

    @autoreleasepool {

        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
        /*Lock the image buffer*/
        CVPixelBufferLockBaseAddress(imageBuffer,0); 
        /*Get information about the image*/
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer); 
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
        size_t width = CVPixelBufferGetWidth(imageBuffer); 
        size_t height = CVPixelBufferGetHeight(imageBuffer);  

        /*Create a CGImageRef from the CVImageBufferRef*/
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
        CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef newImage = CGBitmapContextCreateImage(newContext);

        NSLog(@"image size: h %zu, w %zu", height, width);

        /*Unlock the image buffer*/
        CVPixelBufferUnlockBaseAddress(imageBuffer,0);

        CGRect zoom = CGRectMake(self.touchPoint.y, self.touchPoint.x, 120, 120);
        CGImageRef newImage2 = CGImageCreateWithImageInRect(newImage, zoom);

        /*We release some components*/
        CGContextRelease(newContext); 
        CGColorSpaceRelease(colorSpace);

        UIImage* zoomedImage = [[UIImage alloc] initWithCGImage:newImage2 scale:1.0 orientation:UIImageOrientationUp];
        [self.zoomedView.layer performSelectorOnMainThread:@selector(setContents:) withObject:(__bridge id)zoomedImage.CGImage waitUntilDone:YES];

        CGImageRelease(newImage);
        CGImageRelease(newImage2);

    }

}//end

Is there a reason why the image buffer would be so small, even on iPad?


1 Answer


The quality of an AVCaptureSession is determined by its sessionPreset property, which defaults to AVCaptureSessionPresetHigh. The resolution of the screen on the capturing device is irrelevant; the capture quality is a function of the device's camera and the preset you choose.
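For context, here is a minimal sketch of a video-data-output pipeline in which sessionPreset is never set explicitly, so the session keeps its default of AVCaptureSessionPresetHigh and the delegate receives whatever frame size the camera produces for that preset. The method name setupCaptureSession is illustrative, not taken from the question:

- (void)setupCaptureSession {
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    // No sessionPreset assigned here, so the session stays at
    // its default of AVCaptureSessionPresetHigh.

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input && [session canAddInput:input]) {
        [session addInput:input];
    }

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    // BGRA matches the bitmap context settings used in the question's delegate method.
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL)];
    if ([session canAddOutput:output]) {
        [session addOutput:output];
    }

    [session startRunning];
    // The buffer size seen in captureOutput:didOutputSampleBuffer:fromConnection:
    // is now dictated by the camera and the preset, not by the screen.
}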

If you want the capture resolution to more closely match the screen resolution, you'll have to change the sessionPreset. Just note that none of the presets corresponds directly to any screen resolution; rather, they correspond to common video formats such as VGA, 720p, and 1080p (a sketch of switching presets follows this list):

NSString *const AVCaptureSessionPresetPhoto;
NSString *const AVCaptureSessionPresetHigh;
NSString *const AVCaptureSessionPresetMedium;
NSString *const AVCaptureSessionPresetLow;
NSString *const AVCaptureSessionPreset352x288;
NSString *const AVCaptureSessionPreset640x480;
NSString *const AVCaptureSessionPreset1280x720;
NSString *const AVCaptureSessionPreset1920x1080;
NSString *const AVCaptureSessionPresetiFrame960x540;
NSString *const AVCaptureSessionPresetiFrame1280x720;
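
Switching to one of these presets might look like the sketch below. The 720p choice is just an example, and self.captureSession stands in for whatever session property your class already has; canSetSessionPreset: lets you fall back gracefully on hardware that doesn't support the requested format:

AVCaptureSession *session = self.captureSession; // your existing capture session

[session beginConfiguration];
if ([session canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    // Capture at 720p if the current camera supports it...
    session.sessionPreset = AVCaptureSessionPreset1280x720;
} else if ([session canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    // ...otherwise fall back to VGA.
    session.sessionPreset = AVCaptureSessionPreset640x480;
}
[session commitConfiguration];

// Subsequent calls to captureOutput:didOutputSampleBuffer:fromConnection:
// will deliver buffers at the new preset's dimensions.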
answered 2012-07-11T23:16:25.287