I'm capturing camera frames with the AVFoundation framework, and I want to process them and display them in a UIImageView, but I'm running into trouble. Here's my code:
// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
//NSLog(@"Capturing\n");
// Create a UIImage from the sample buffer data
UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
NSLog(@"Image: %f %f\n", image.size.height, image.size.width);
[imageView setImage:image];
}
However, nothing is displayed. The correct dimensions do show up in the NSLog output. When I put:
[imageView setImage:[UIImage imageNamed:@"SomethingElse.png"]];
in viewDidLoad, the image displays correctly (so I know the UIImageView is hooked up properly).
Is there any reason this shouldn't work? I'm at a loss.

Cheers, Brett