I need to crop out a face (or multiple faces) from a given image and use the cropped face image elsewhere. I am using CIDetectorTypeFace from Core Image. The problem is that the new UIImage containing just the detected face needs to be bigger, because the hair or the lower jaw gets cut off. How do I increase the size of the faceFeature.bounds rect that I use both for initWithFrame: and for the crop? (I sketch what I mean right after the sample code below.)
Sample code I am using:
CIImage* image = [CIImage imageWithCGImage:staticBG.image.CGImage];
CIDetector* detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                          context:nil
                                          options:@{CIDetectorAccuracy : CIDetectorAccuracyHigh}];
NSArray* features = [detector featuresInImage:image];
for (CIFaceFeature* faceFeature in features)
{
    // Mark the detected face region with a red border
    UIView* faceView = [[UIView alloc] initWithFrame:faceFeature.bounds];
    faceView.layer.borderWidth = 1;
    faceView.layer.borderColor = [[UIColor redColor] CGColor];
    [staticBG addSubview:faceView];

    // Crop the detected face out of the source image
    CGImageRef imageRef = CGImageCreateWithImageInRect([staticBG.image CGImage], faceFeature.bounds);
    [resultView setImage:[UIImage imageWithCGImage:imageRef]];
    CGImageRelease(imageRef);
}
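What I imagine is padding faceFeature.bounds before cropping, something like the sketch below (this would go inside the loop). The 0.3 padding factor is just a guess on my part, and I am not sure the clamping is right:
// Pad the detected bounds by a fraction of the face size, then
// clamp to the image so the crop rect never runs off the edge.
// The 0.3 factor is an arbitrary guess.
CGRect faceRect = faceFeature.bounds;
CGFloat padX = faceRect.size.width * 0.3;
CGFloat padY = faceRect.size.height * 0.3;
// Negative insets grow the rect outward on all sides
CGRect paddedRect = CGRectInset(faceRect, -padX, -padY);
// Keep the rect inside the image bounds
CGRect imageBounds = CGRectMake(0, 0,
                                CGImageGetWidth(staticBG.image.CGImage),
                                CGImageGetHeight(staticBG.image.CGImage));
paddedRect = CGRectIntersection(paddedRect, imageBounds);
CGImageRef paddedRef = CGImageCreateWithImageInRect([staticBG.image CGImage], paddedRect);
[resultView setImage:[UIImage imageWithCGImage:paddedRef]];
CGImageRelease(paddedRef);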
Note: The red frame that I draw to show the detected face region does not match the cropped image at all. Maybe I am not displaying the frame correctly, but since I do not need to show the frame (I really just need the cropped face), I am not worrying about it much.
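In case it is related: I suspect the mismatch comes from Core Image reporting bounds with a bottom-left origin while UIKit uses a top-left origin, so a Y-flip like the one below might be needed before the rect is used as a view frame or crop rect. This is my assumption and I have not verified it:
// Assumption: faceFeature.bounds uses Core Image coordinates
// (origin at the bottom-left), so flip the Y axis to convert it
// into UIKit's top-left coordinate space before using it.
CGFloat imageHeight = CGImageGetHeight(staticBG.image.CGImage);
CGRect ciRect = faceFeature.bounds;
CGRect uiRect = CGRectMake(ciRect.origin.x,
                           imageHeight - ciRect.origin.y - ciRect.size.height,
                           ciRect.size.width,
                           ciRect.size.height);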