Hi, I'm storing a captured image in an NSObject subclass called CaptureManager. In the .h I declare a UIImage property:
@property (nonatomic,strong) UIImage *captureImage;
Then in the .m I set the image:
- (void)captureStillImage
{
    AVCaptureConnection *stillImageConnection = [[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo];
    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                                         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            captureImage = [[UIImage alloc] initWithData:imageData];
            [[NSNotificationCenter defaultCenter] postNotificationName:@"photoTaken" object:nil userInfo:nil];
        }
        if ([[self delegate] respondsToSelector:@selector(captureManagerStillImageCaptured:)]) {
            [[self delegate] captureManagerStillImageCaptured:self];
        }
    }];
}
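One thing I wasn't sure about: inside the completion block, should I be assigning through the property instead of the ivar, i.e. something like

self.captureImage = [[UIImage alloc] initWithData:imageData];

or does it not matter here?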
Now in my ViewController I'm trying to display that image like this:
CaptureManager *captureManager = [[CaptureManager alloc] init];
UIImage *captureBackgroundImage = captureManager.captureImage;
UIImageView *captureImageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 480)];
[captureImageView setImage:captureBackgroundImage];
[self.view addSubview:captureImageView];
[self.view bringSubviewToFront:captureImageView];
Am I doing everything right? Is the problem the way I set captureImage? Could the completion handler in the captureStillImage method be messing things up? Nothing shows up on screen. Any help would be much appreciated!
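In case it matters, I also wondered whether I need to wait for the "photoTaken" notification (or the captureManagerStillImageCaptured: delegate callback) before reading captureImage, and only set the image view at that point. This is roughly what I had in mind, where photoWasTaken: and the captureManager / captureImageView properties are just placeholder names for the sketch:

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(photoWasTaken:)
                                             name:@"photoTaken"
                                           object:nil];

- (void)photoWasTaken:(NSNotification *)notification
{
    // Only read the image once the async capture has actually finished
    self.captureImageView.image = self.captureManager.captureImage;
}

I left that out for now because I wasn't sure it was actually needed.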