EDIT: I found this code, which helps with front-camera images: http://blog.logichigh.com/2008/06/05/uiimage-fix/

Hoping someone else has had a similar problem and can help me; I haven't found a solution yet. (It may look a bit long, but most of it is just helper code.)
I'm using the iOS face detector on images taken from the camera (front and back) as well as on images from the gallery (I'm using UIImagePicker for both capturing images with the camera and picking images from the gallery, not AVFoundation for taking pictures as in the SquareCam demo).
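For context, this is roughly how I present the picker; a minimal sketch of my setup code, where self.imgPicker matches the property used in the delegate method at the bottom (camera vs. gallery just swaps the sourceType):

// Sketch of my picker setup (pre-iOS 6 style, to match the
// dismissModalViewControllerAnimated: call in the delegate below).
self.imgPicker = [[UIImagePickerController alloc] init];
self.imgPicker.sourceType = UIImagePickerControllerSourceTypeCamera; // or UIImagePickerControllerSourceTypePhotoLibrary
self.imgPicker.delegate = self;
[self presentModalViewController:self.imgPicker animated:YES];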
The coordinates I get back from the detection (when there is any) are really messed up, so I wrote a short debug method to get the face bounds, along with a utility that draws a square over them; I wanted to check which orientation the detector was working in:
#define RECTBOX(R) [NSValue valueWithCGRect:R]
#define RECTUNBOX(V) [V CGRectValue]

- (NSArray *)detectFaces:(UIImage *)inputimage
{
    _detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                   context:nil
                                   options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyLow
                                                                       forKey:CIDetectorAccuracy]];
    NSNumber *orientation = [NSNumber numberWithInt:[inputimage imageOrientation]]; // I also saw code where they add +1 to the orientation
    NSDictionary *imageOptions = [NSDictionary dictionaryWithObject:orientation forKey:CIDetectorImageOrientation];
    CIImage *ciimage = [CIImage imageWithCGImage:inputimage.CGImage options:imageOptions];

    // try like this first
    // NSArray *features = [self.detector featuresInImage:ciimage options:imageOptions];
    // if that doesn't work, go on to this (trying all orientations)
    NSMutableArray *returnArray = [NSMutableArray array];
    NSArray *features = nil;
    int exif;
    // iOS face detector: trying all of the orientations
    for (exif = 1; exif <= 8; exif++)
    {
        NSNumber *orientation = [NSNumber numberWithInt:exif];
        NSDictionary *imageOptions = [NSDictionary dictionaryWithObject:orientation forKey:CIDetectorImageOrientation];
        NSTimeInterval start = [NSDate timeIntervalSinceReferenceDate];
        features = [self.detector featuresInImage:ciimage options:imageOptions];
        NSTimeInterval duration = [NSDate timeIntervalSinceReferenceDate] - start;
        NSLog(@"faceDetection: face detection total runtime is %f s", duration);
        if (features.count > 0)
        {
            NSString *str = [NSString stringWithFormat:@"found faces using exif %d", exif];
            [faceDetection log:str];
            break;
        }
    }
    if (features.count > 0)
    {
        [faceDetection log:@"-I- Found faces with ios face detector"];
        for (CIFaceFeature *feature in features)
        {
            // CoreImage returns bounds with a bottom-left origin; flip to UIKit's top-left
            CGRect rect = feature.bounds;
            CGRect r = CGRectMake(rect.origin.x,
                                  inputimage.size.height - rect.origin.y - rect.size.height,
                                  rect.size.width,
                                  rect.size.height);
            [returnArray addObject:RECTBOX(r)];
        }
    } else {
        // no faces from the iOS face detector; fall back to the OpenCV detector here
    }
    return returnArray;
}
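One thing I suspect here is that CIDetectorImageOrientation expects an EXIF orientation value (1-8), while the raw UIImageOrientation enum values run 0-7 and aren't the same numbering; that's probably what the "+1" hack in my comment above approximates, but the two sets don't just differ by one. This is the mapping I'm considering using instead (a sketch, not yet verified against all devices):

// Sketch: translate UIImageOrientation into the EXIF orientation
// value that CIDetectorImageOrientation appears to expect.
static int exifOrientationFromUIImageOrientation(UIImageOrientation o)
{
    switch (o) {
        case UIImageOrientationUp:            return 1;
        case UIImageOrientationDown:          return 3;
        case UIImageOrientationLeft:          return 8;
        case UIImageOrientationRight:         return 6;
        case UIImageOrientationUpMirrored:    return 2;
        case UIImageOrientationDownMirrored:  return 4;
        case UIImageOrientationLeftMirrored:  return 5;
        case UIImageOrientationRightMirrored: return 7;
    }
    return 1;
}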
![1][1]
After trying a lot of different photos, I noticed that the face detector's orientation is not consistent with the camera image's properties. I took a bunch of photos with the front camera where the UIImage orientation was 3 (querying imageOrientation), but the face detector finds no faces for that setting. When running through all eight EXIF possibilities, the face detector does eventually pick up faces, but each time for a different orientation.
[1]: http://i.stack.imgur.com/D7bkZ.jpg
How can I fix this? Is there a bug in my code?
The other problem I have (closely tied to the face detector): when the face detector does pick up faces, but for the "wrong" orientation (this happens mostly with the front camera), the UIImage initially displays correctly in a UIImageView, but when I draw my square overlay (I'm using OpenCV in my app, so I decided to convert the UIImage to a cv::Mat and draw the overlay with OpenCV) the whole image is rotated 90 degrees (only the cv::Mat image, not the UIImage as initially displayed).
The only reason I can think of is that the face detector is messing with some buffer (context?) that the UIImage-to-cv::Mat conversion is also using. How can I separate these buffers?
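As a workaround I've started experimenting with the approach from the link in my EDIT above: redrawing the UIImage through UIKit before handing it anywhere else, so the orientation gets baked into the pixel data (a sketch; normalizedImage: is my own hypothetical helper name):

// Sketch: redraw the image so its pixels match what UIImageView shows,
// i.e. bake imageOrientation into the bitmap instead of carrying it as metadata.
- (UIImage *)normalizedImage:(UIImage *)image
{
    if (image.imageOrientation == UIImageOrientationUp)
        return image; // already upright, nothing to do
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}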
The code that converts a UIImage to a cv::Mat is (from the "famous" UIImage category someone made):
-(cv::Mat)CVMat
{
    CGColorSpaceRef colorSpace = CGImageGetColorSpace(self.CGImage);
    CGFloat cols = self.size.width;
    CGFloat rows = self.size.height;
    cv::Mat cvMat(rows, cols, CV_8UC4); // 8 bits per component, 4 channels
    CGContextRef contextRef = CGBitmapContextCreate(cvMat.data,    // Pointer to backing data
                                                    cols,          // Width of bitmap
                                                    rows,          // Height of bitmap
                                                    8,             // Bits per component
                                                    cvMat.step[0], // Bytes per row
                                                    colorSpace,    // Colorspace
                                                    kCGImageAlphaNoneSkipLast |
                                                    kCGBitmapByteOrderDefault); // Bitmap info flags
    CGContextDrawImage(contextRef, CGRectMake(0, 0, cols, rows), self.CGImage);
    CGContextRelease(contextRef);
    return cvMat;
}

- (id)initWithCVMat:(const cv::Mat&)cvMat
{
    NSData *data = [NSData dataWithBytes:cvMat.data length:cvMat.elemSize() * cvMat.total()];
    CGColorSpaceRef colorSpace;
    if (cvMat.elemSize() == 1)
    {
        colorSpace = CGColorSpaceCreateDeviceGray();
    }
    else
    {
        colorSpace = CGColorSpaceCreateDeviceRGB();
    }
    CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);
    CGImageRef imageRef = CGImageCreate(cvMat.cols,           // Width
                                        cvMat.rows,           // Height
                                        8,                    // Bits per component
                                        8 * cvMat.elemSize(), // Bits per pixel
                                        cvMat.step[0],        // Bytes per row
                                        colorSpace,           // Colorspace
                                        kCGImageAlphaNone | kCGBitmapByteOrderDefault, // Bitmap info flags
                                        provider,             // CGDataProviderRef
                                        NULL,                 // Decode
                                        false,                // Should interpolate
                                        kCGRenderingIntentDefault); // Intent
    self = [self initWithCGImage:imageRef];
    CGImageRelease(imageRef);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);
    return self;
}

-(cv::Mat)CVRgbMat
{
    cv::Mat tmpimage = self.CVMat;
    cv::Mat image;
    cvtColor(tmpimage, image, cv::COLOR_BGRA2BGR);
    return image;
}
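For reference, the [utils drawSquareOnImage:square:] helper used in the delegate below is essentially this round trip through the category above (a sketch of it; cv::rectangle stands in for my actual OpenCV drawing code):

// Sketch: UIImage -> cv::Mat, draw the rect with OpenCV, convert back.
// This round trip is where the 90-degree rotation shows up.
+ (UIImage *)drawSquareOnImage:(UIImage *)img square:(CGRect)r
{
    cv::Mat mat = [img CVMat];
    cv::rectangle(mat,
                  cv::Point(r.origin.x, r.origin.y),
                  cv::Point(CGRectGetMaxX(r), CGRectGetMaxY(r)),
                  cv::Scalar(0, 255, 0, 255), // green, full alpha
                  4);                         // line thickness
    return [[UIImage alloc] initWithCVMat:mat];
}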
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)img editingInfo:(NSDictionary *)editInfo
{
    self.prevImage = img;
    // self.previewView.image = img;
    NSArray *arr = [[faceDetection sharedFaceDetector] detectFaces:img];
    UIImage *annotated = img;
    for (id r in arr)
    {
        CGRect rect = RECTUNBOX(r);
        // draw each square onto the accumulated result rather than the
        // original, so all detected faces end up in the final image
        annotated = [utils drawSquareOnImage:annotated square:rect];
    }
    self.previewView.image = annotated;
    [self.imgPicker dismissModalViewControllerAnimated:YES];
}