I use AVCaptureSession to receive frames from the iPhone camera. It delivers each frame in a delegate callback; in that callback I build a UIImage and spawn another thread to process it:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the image buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get information about the image
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a CGImageRef from the CVImageBufferRef
    // (assumes the output delivers kCVPixelFormatType_32BGRA frames)
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                    kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);

    // Release the intermediate Core Graphics objects
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);

    UIImage *uiimage = [UIImage imageWithCGImage:newImage scale:1.0 orientation:UIImageOrientationDown];
    CGImageRelease(newImage);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    // Spawn a new processing thread only if the previous one has finished
    if (processImageThread == nil || !processImageThread.isExecuting) {
        [processImageThread release];
        processImageThread = [[NSThread alloc] initWithTarget:self selector:@selector(processImage:) object:uiimage];
        [processImageThread start];
    }
    [pool drain];
}
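One likely source of overhead here is allocating a brand-new NSThread for every frame that passes the check above. A common alternative is to let AVFoundation deliver frames on a serial dispatch queue and drop frames that arrive while processing is still busy. A minimal sketch of the session configuration, run once during setup (`videoOutput`, `processingQueue`, and `session` are illustrative names, not from the code above):

// Sketch: configure the video output once, during session setup.
// Frames then arrive on a serial background queue, and late frames
// are dropped instead of piling up behind a slow consumer.
dispatch_queue_t processingQueue = dispatch_queue_create("com.example.frameprocessing", NULL);
AVCaptureVideoDataOutput *videoOutput = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
videoOutput.alwaysDiscardsLateVideoFrames = YES;
videoOutput.videoSettings = [NSDictionary dictionaryWithObject:
                                 [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                                         forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[videoOutput setSampleBufferDelegate:self queue:processingQueue];
[session addOutput:videoOutput]; // `session` is the existing AVCaptureSession

With this setup the delegate callback itself already runs off the main thread, so no per-frame NSThread is needed at all.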
I process the image on another thread using Core Image filters:
- (void)processImage:(UIImage *)image {
    // This runs on a secondary NSThread, which needs its own autorelease pool
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    NSLog(@"Begin process");

    CIImage *ciimage = [CIImage imageWithCGImage:image.CGImage];
    CIFilter *filter = [CIFilter filterWithName:@"CIColorMonochrome"];
    [filter setDefaults];
    [filter setValue:ciimage forKey:@"inputImage"];
    [filter setValue:[CIColor colorWithRed:0.5 green:0.5 blue:1.0] forKey:@"inputColor"];
    CIImage *ciResult = [filter outputImage];

    // A new CIContext is created for every frame
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciResult fromRect:[ciResult extent]];
    UIImage *uiResult = [UIImage imageWithCGImage:cgImage scale:1.0 orientation:UIImageOrientationRight];
    CGImageRelease(cgImage);

    [self performSelectorOnMainThread:@selector(setImageForImageView:) withObject:uiResult waitUntilDone:YES];

    NSLog(@"End process");
    [pool drain];
}
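Creating a CIContext with contextWithOptions: is expensive, and doing it once per frame costs far more than the filtering itself; the context is designed to be created once and reused. A sketch of that change, assuming a hypothetical `ciContext` instance variable (contextWithEAGLContext: is available on iOS 5+ and keeps the filter rendering on the GPU; it requires the OpenGLES framework):

// Sketch: create the context lazily, once, and reuse it for every frame.
// `ciContext` is a hypothetical instance variable, not in the code above.
if (ciContext == nil) {
    EAGLContext *glContext = [[[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2] autorelease];
    ciContext = [[CIContext contextWithEAGLContext:glContext] retain];
}
CGImageRef cgImage = [ciContext createCGImage:ciResult fromRect:[ciResult extent]];

Note that createCGImage:fromRect: still copies the rendered result back to the CPU; drawing straight into a GL-backed view avoids even that copy, which is essentially what the project linked below does.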
and set the resulting image as the layer's contents:
- (void)setImageForImageView:(UIImage *)image {
    self.view.layer.contents = (id)image.CGImage;
}
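One more stall worth noting: waitUntilDone:YES in processImage: blocks the processing thread until the main thread has finished the display pass, so processing and display never overlap. Since performSelectorOnMainThread: retains its argument, the non-blocking variant is safe here and avoids that wait:

// In processImage: hand the result to the main thread without blocking the worker
[self performSelectorOnMainThread:@selector(setImageForImageView:)
                       withObject:uiResult
                    waitUntilDone:NO];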
But it is very laggy. I found an open-source project that implements very smooth real-time image effects (it also uses AVCaptureSession). What is the difference between my code and theirs, and how do I build a real-time image-effect processing app?
Here is the open-source project: https://github.com/gobackspaces/DLCImagePickerController#readme
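For comparison, that project builds its live preview on the GPUImage framework, so each frame is filtered by OpenGL ES shaders and displayed without ever being converted to a UIImage/CGImage on the CPU, which is the round trip the code above pays per frame. A rough sketch of that style of pipeline, using class and method names from GPUImage's documented API (in a real app these objects would be retained instance variables):

// Sketch of a GPU-side pipeline in the style of GPUImage
// (names from the GPUImage project, not from this question's code)
GPUImageVideoCamera *videoCamera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                        cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

GPUImageSepiaFilter *filter = [[GPUImageSepiaFilter alloc] init];
GPUImageView *filteredView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:filteredView];

[videoCamera addTarget:filter];     // camera -> filter
[filter addTarget:filteredView];    // filter -> on-screen view
[videoCamera startCameraCapture];   // frames stay on the GPU end to end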