I'm trying to use a CIDetector with CIDetectorTypeFace for basic face detection, and while it recognizes faces correctly, the bounds rectangle it returns is consistently wrong for every image I throw at it. Here's a sample, with the detected bounds drawn as a green box: http://i.imgur.com/bQNaEnq.jpg
The rectangle always seems to be shifted down, or mirrored vertically, by the same amount, as if the coordinates were measured from the bottom-left corner rather than the top-left. I've tried all eight CIDetectorImageOrientation values on this image and they all return the same incorrect coordinates. What am I missing here? Here is the code:
NSDictionary *detectorOptions = @{ CIDetectorAccuracy : CIDetectorAccuracyHigh };
self.faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:detectorOptions];
CIImage *ciImage = [CIImage imageWithCGImage:image.CGImage];
self.features = [self.faceDetector featuresInImage:ciImage options:@{ CIDetectorImageOrientation : @1 }];
That's really it. The image is a plain UIImage created with imageWithData: from data downloaded from the web.
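In case it's relevant, the green box is drawn by using each feature's bounds directly as a UIKit rect, along these lines (simplified sketch; the real code also handles view scaling, which I've left out here):

// Draw the detected bounds onto a copy of the image.
// Assumes the drawing context is the same size as the image, so no scaling.
UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
[image drawAtPoint:CGPointZero];
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextSetStrokeColorWithColor(ctx, [UIColor greenColor].CGColor);
CGContextSetLineWidth(ctx, 4.0);
for (CIFaceFeature *face in self.features) {
    // face.bounds is used as-is, i.e. treated as a top-left-origin UIKit rect
    CGContextStrokeRect(ctx, face.bounds);
}
UIImage *annotated = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();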