
I am building a prototype app on iOS, and I’m cannibalizing some Apple sample code to do it (thin ice, I know—this code uses goto statements :\ ). I am using the AVCam project from Session 520 - What's New in Camera Capture. I don’t need video capture capability, just still photos.

The device inputs and outputs are set up like so:

    // Init the device inputs
    AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:nil];
    AVCaptureDeviceInput *newAudioInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self audioDevice] error:nil];


    // Setup the still image file output
    AVCaptureStillImageOutput *newStillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG};
    [newStillImageOutput setOutputSettings:outputSettings];


    // Create session (use default AVCaptureSessionPresetHigh)
    AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];


    // Add inputs and output to the capture session
    if ([newCaptureSession canAddInput:newVideoInput]) {
        [newCaptureSession addInput:newVideoInput];
    }
    if ([newCaptureSession canAddInput:newAudioInput]) {
        [newCaptureSession addInput:newAudioInput];
    }
    if ([newCaptureSession canAddOutput:newStillImageOutput]) {
        [newCaptureSession addOutput:newStillImageOutput];
    }

    [self setStillImageOutput:newStillImageOutput];
    [self setVideoInput:newVideoInput];
    [self setAudioInput:newAudioInput];
    [self setSession:newCaptureSession];

And here is the method that’s called when I tap the shutter button:

    - (void) captureStillImage
    {
        AVCaptureConnection *stillImageConnection = [[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo];
        // 'orientation' is an AVCaptureVideoOrientation value tracked elsewhere in this class (not shown here)
        if ([stillImageConnection isVideoOrientationSupported])
            [stillImageConnection setVideoOrientation:orientation];

        [[self stillImageOutput]
            captureStillImageAsynchronouslyFromConnection:stillImageConnection
                 completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

                     ALAssetsLibraryWriteImageCompletionBlock completionBlock = ^(NSURL *assetURL, NSError *error) {
                         if (error)
                         {
                             if ([[self delegate] respondsToSelector:@selector(captureManager:didFailWithError:)])
                             {
                                 [[self delegate] captureManager:self didFailWithError:error];
                             }
                         }
                     };

                     if (imageDataSampleBuffer != NULL)
                     {
                         NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                         ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];

                         UIImage *image = [[UIImage alloc] initWithData:imageData];

                         if ([self.delegate respondsToSelector:@selector(captureManagerCapturedImage:)])
                         {
                             dispatch_async(dispatch_get_main_queue(), ^{
                                 [self.delegate captureManagerCapturedImage:image];
                             });
                         }

                         [library writeImageToSavedPhotosAlbum:[image CGImage]
                                                   orientation:(ALAssetOrientation)[image imageOrientation]
                                               completionBlock:completionBlock];

                     }
                     else
                     {
                         completionBlock(nil, error);
                     }

                     if ([[self delegate] respondsToSelector:@selector(captureManagerStillImageCaptured:)])
                     {
                         [[self delegate] captureManagerStillImageCaptured:self];
                     }
                 }];
    }

This code successfully captures an image and saves it to the library. However, at some point while I was working on it, the app went from capturing 5-megapixel 4:3 images to capturing 1920x1080 16:9 images. I can't find anywhere that the aspect ratio is specified, and I didn't change any of the code relating to the configuration of the camera, capture session, or capture connection. Why did my camera start taking 16:9 photos?

Update: I just re-ran Apple's original sample code, and it appears that it, too, saves 16:9 images captured directly from the video stream. It's quite possible that I was insane before, or that I took a test shot with Camera.app and was looking at that. So my real question is: how do I show a live feed from the camera on screen while I'm shooting, and still take a full-resolution photo? I can't use UIImagePickerController, because I need to be able to overlay things on top of the live camera feed.
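For reference, the direction I was headed with AVFoundation looks roughly like this. It's only a sketch, assuming the `session` property set up above and a hypothetical `previewView` in my view controller; AVCaptureVideoPreviewLayer and AVCaptureSessionPresetPhoto are standard API, everything else is a placeholder.

    // Sketch: live preview plus full-resolution stills, assuming the `session`
    // ivar from the setup code and a placeholder `previewView`.
    [[self session] setSessionPreset:AVCaptureSessionPresetPhoto]; // 4:3 full-sensor stills

    AVCaptureVideoPreviewLayer *previewLayer =
        [AVCaptureVideoPreviewLayer layerWithSession:[self session]];
    [previewLayer setFrame:[previewView bounds]];
    [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [[previewView layer] addSublayer:previewLayer];

    // Custom controls are just ordinary subviews added on top of previewView.
    [[self session] startRunning];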

Update 2: I was able to solve this by throwing out the AVCapture code I was using; it turns out that UIImagePickerController does what I need. I didn't realize you could overlay custom controls on it; I thought it took over the whole screen until you were done taking a picture.
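In case it helps anyone else, this is roughly what the working setup looks like. A minimal sketch, assuming a hypothetical `overlayView` that holds my custom controls:

    // Sketch: UIImagePickerController with a custom overlay instead of the
    // stock camera controls. `overlayView` is a placeholder for my own UI.
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.showsCameraControls = NO;        // hide the built-in shutter and controls
    picker.cameraOverlayView = overlayView; // my controls, drawn over the live feed
    picker.delegate = self;                 // UIImagePickerControllerDelegate + UINavigationControllerDelegate

    [self presentViewController:picker animated:YES completion:nil];

The overlay's shutter button then calls -takePicture on the picker, and the full-resolution image comes back through -imagePickerController:didFinishPickingMediaWithInfo:.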


1 Answer


If you're capturing frames from a video source, you'll end up with a 16:9 resolution: grabbing frames from the video stream and taking actual photos are two different things.
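As a rough illustration using the session from the question (a sketch, not code from your project): the session preset is what governs the still image size. With the default AVCaptureSessionPresetHigh, stills from AVCaptureStillImageOutput come out sized like the video stream (hence the 1920x1080 captures), while the Photo preset asks for the camera's full 4:3 photo resolution:

    // Sketch: request photo-resolution stills instead of video-sized ones.
    if ([newCaptureSession canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
        [newCaptureSession setSessionPreset:AVCaptureSessionPresetPhoto];
    }
    // Under the default AVCaptureSessionPresetHigh, the still image output
    // matches the video dimensions, which is why you're seeing 16:9 images.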

answered 2012-12-13 at 16:25