
I'm trying to use CIDetector/CIFaceDetector for basic face detection, and, while it seems to recognize faces correctly, the bounds rectangle is consistently inaccurate in everything I throw at it. Here's a sample, with the bounds it detects as a green box: http://i.imgur.com/bQNaEnq.jpg

Everything seems to be just universally shifted down or mirrored by this amount. It's like the coordinates are coming from the bottom left rather than the top left. I've tried all eight CIDetectorImageOrientations on this image and they all return the same incorrect coordinates. What am I missing here? Here is the code:

NSDictionary *detectorOptions = @{ CIDetectorAccuracy : CIDetectorAccuracyHigh };
self.faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:detectorOptions];
self.features = [self.faceDetector featuresInImage:[CIImage imageWithCGImage:image.CGImage options:@{CIDetectorImageOrientation: @(1)}]];

That's really it. The image is a basic UIImage created with imageWithData: from the web.


1 Answer


You are working with both UIKit and Core Image, and those two frameworks use different coordinate systems:

  • The UIKit coordinate system has its origin at the top left
  • Core Image coordinates have their origin at the bottom left

You are most likely drawing a green rectangle that is in Core Image coordinates into a UIKit context. Your code is working correctly; you just need to convert the coordinates.
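
A minimal sketch of that conversion, assuming the detected feature is in a variable named faceFeature and that image has a scale of 1, so its point size matches the CIImage's pixel size:

// Core Image measures from the bottom-left corner; UIKit measures from the top-left.
// Flipping the y-axis around the image height maps one system onto the other.
CGAffineTransform transform = CGAffineTransformMakeScale(1, -1);
transform = CGAffineTransformTranslate(transform, 0, -image.size.height);

// faceFeature.bounds comes back in Core Image coordinates; this yields a UIKit rect.
CGRect faceRectInUIKit = CGRectApplyAffineTransform(faceFeature.bounds, transform);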

You can also find this documented in the iOS Developer Library.

For a very concise way to convert between the two systems, see the CoreImage and UIKit coordinates blog post.
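
Applied to the code from the question, the loop over the detected features might look something like this (again just a sketch; the actual drawing of the green box is left as a comment since the question doesn't show it):

CGAffineTransform transform = CGAffineTransformMakeScale(1, -1);
transform = CGAffineTransformTranslate(transform, 0, -image.size.height);

for (CIFaceFeature *face in self.features) {
    // Convert each face's bottom-left-origin bounds into UIKit's top-left-origin space.
    CGRect uikitBounds = CGRectApplyAffineTransform(face.bounds, transform);
    // ... draw the green rectangle at uikitBounds in the UIKit context ...
}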

Answered 2014-09-13T20:17:15.917