
I want to try some image-filter features on the iPhone, like Instagram. I use an imagePickerController to get photos from the camera roll. I understand that the image returned by the imagePickerController is reduced to save memory, and that loading the full original image into a UIImage is unwise. But how can I process the image and then save it at its original pixel dimensions? I am using an iPhone 4S as my development device.

The original photo in the camera roll is 3264 * 2448.

The image returned by UIImagePickerControllerOriginalImage is 1920 * 1440.

The image returned by UIImagePickerControllerEditedImage is 640 * 640.

imageViewOld (the UIImagePickerControllerOriginalImage image cropped with the UIImagePickerControllerCropRect [80, 216, 1280, 1280]) is 1280 * 1224.

imageViewNew (the UIImagePickerControllerOriginalImage image cropped with a doubled UIImagePickerControllerCropRect [80, 216, 2560, 2560]) is 1840 * 1224.

The same photo processed through Instagram is 1280 * 1280.

My questions are:

  1. Why doesn't UIImagePickerControllerOriginalImage return the "original" photo? Why is it scaled down to 1920 * 1440?

  2. Why doesn't UIImagePickerControllerEditedImage return a 1280 * 1280 image, given that UIImagePickerControllerCropRect shows it was cut with a 1280 * 1280 square?

  3. How can I make a square cut of the original photo to get a 2448 * 2448 image?

Thanks in advance. Here is my code:

  - (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
  {
      NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
      if ([mediaType isEqualToString:@"public.image"])
      {
          UIImage *imageEdited = [info objectForKey:UIImagePickerControllerEditedImage];
          UIImage *imagePicked = [info objectForKey:UIImagePickerControllerOriginalImage];

          CGRect cropRect = [[info objectForKey:UIImagePickerControllerCropRect] CGRectValue];

          NSLog(@"Original width = %f height = %f", imagePicked.size.width, imagePicked.size.height);
          //Original width = 1440.000000 height = 1920.000000

          NSLog(@"imageEdited width = %f height = %f", imageEdited.size.width, imageEdited.size.height);
          //imageEdited width = 640.000000 height = 640.000000

          NSLog(@"cropRect %f %f %f %f", cropRect.origin.x, cropRect.origin.y, cropRect.size.width, cropRect.size.height);
          //cropRect 80.000000 216.000000 1280.000000 1280.000000

          CGRect rectNew = CGRectMake(cropRect.origin.x, cropRect.origin.y, cropRect.size.width * 2, cropRect.size.height * 2);
          CGRect rectOld = CGRectMake(cropRect.origin.x, cropRect.origin.y, cropRect.size.width, cropRect.size.height);

          CGImageRef imageRefNew = CGImageCreateWithImageInRect([imagePicked CGImage], rectNew);
          CGImageRef imageRefOld = CGImageCreateWithImageInRect([imagePicked CGImage], rectOld);

          UIImageView *imageViewNew = [[UIImageView alloc] initWithImage:[UIImage imageWithCGImage:imageRefNew]];
          CGImageRelease(imageRefNew);

          UIImageView *imageViewOld = [[UIImageView alloc] initWithImage:[UIImage imageWithCGImage:imageRefOld]];
          CGImageRelease(imageRefOld);

          NSLog(@"imageViewNew width = %f height = %f", imageViewNew.image.size.width, imageViewNew.image.size.height);
          //imageViewNew width = 1840.000000 height = 1224.000000

          NSLog(@"imageViewOld width = %f height = %f", imageViewOld.image.size.width, imageViewOld.image.size.height);
          //imageViewOld width = 1280.000000 height = 1224.000000

          UIImageWriteToSavedPhotosAlbum(imageEdited, nil, nil, NULL);

          //imageRotatedByDegrees: is a custom UIImage category, not a UIKit method
          UIImageWriteToSavedPhotosAlbum([imageViewNew.image imageRotatedByDegrees:90.0], nil, nil, NULL);
          UIImageWriteToSavedPhotosAlbum([imageViewOld.image imageRotatedByDegrees:90.0], nil, nil, NULL);

          //assign the image to a UIImageView
          self.imageV.contentMode = UIViewContentModeScaleAspectFit;
          self.imageV.frame = CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.width);
          self.imageV.image = imageEdited;
      }

      [self dismissModalViewControllerAnimated:YES];
  }

1 Answer


As you have observed, the UIImagePickerController will return a scaled-down edited image, sometimes 640x640 and sometimes 320x320, depending on the device.

Your question:

How can I make a square cut of the original photo to get a 2448 * 2448 image?

To do this you first need to use the UIImagePickerControllerCropRect to create a new image from the original image obtained through the UIImagePickerControllerOriginalImage key of the info dictionary. Using the Quartz function CGImageCreateWithImageInRect, you can create a new image that contains only the pixels bounded by the passed rect, in this case the crop rect. You will need to take orientation into account for this to work properly. Then you need only scale the image to your desired size. It is important to note that the crop rect is relative to the original image after it has been oriented correctly, not as it comes out of the camera or photo library. This is why we need to transform the crop rect to match the orientation before we start using Quartz functions to create new images.

I took your code above and set it up to create a 1280x1280 image from the original image based on the crop rect. There are still some edge cases here that have not been addressed, e.g. the crop rect can sometimes have negative values, and this code assumes a square cropping rect.

  1. First, transform the crop rect to account for the orientation and size of the incoming image. The transformCGRectForUIImageOrientation function is from NiftyBean.
  2. Create an image that is cropped to the transformed cropping rect.
  3. Scale (and rotate) the image to the desired size, i.e. 1280x1280.
  4. Create a UIImage from the CGImage with the correct scale and orientation.

Here is your code with the changes. UPDATE: new code that should take care of the missing cases has been added below it.

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:@"public.image"])
    {
        UIImage *imageEdited = [info objectForKey:UIImagePickerControllerEditedImage];
        UIImage *imagePicked = [info objectForKey:UIImagePickerControllerOriginalImage];

        CGRect cropRect = [[info objectForKey:UIImagePickerControllerCropRect] CGRectValue];

        NSLog(@"Original width = %f height = %f", imagePicked.size.width, imagePicked.size.height);
        //Original width = 1440.000000 height = 1920.000000

        NSLog(@"imageEdited width = %f height = %f", imageEdited.size.width, imageEdited.size.height);
        //imageEdited width = 640.000000 height = 640.000000

        NSLog(@"cropRect %@", NSStringFromCGRect(cropRect));
        //cropRect {{80, 216}, {1280, 1280}}

        CGSize finalSize = CGSizeMake(1280, 1280);
        CGImageRef imagePickedRef = imagePicked.CGImage;

        CGRect transformedRect = transformCGRectForUIImageOrientation(cropRect, imagePicked.imageOrientation, imagePicked.size);
        CGImageRef cropRectImage = CGImageCreateWithImageInRect(imagePickedRef, transformedRect);
        CGColorSpaceRef colorspace = CGImageGetColorSpace(imagePickedRef);
        CGContextRef context = CGBitmapContextCreate(NULL,
                                                     finalSize.width,
                                                     finalSize.height,
                                                     CGImageGetBitsPerComponent(imagePickedRef),
                                                     0, //Let Quartz compute bytesPerRow for the new width
                                                     colorspace,
                                                     CGImageGetAlphaInfo(imagePickedRef));
        CGContextSetInterpolationQuality(context, kCGInterpolationHigh); //Give the context a hint that we want high quality during the scale
        CGContextDrawImage(context, CGRectMake(0, 0, finalSize.width, finalSize.height), cropRectImage);
        CGImageRelease(cropRectImage);

        CGImageRef instaImage = CGBitmapContextCreateImage(context);
        CGContextRelease(context);

        //assign the image to a UIImageView
        UIImage *image = [UIImage imageWithCGImage:instaImage scale:imagePicked.scale orientation:imagePicked.imageOrientation];
        self.imageView.contentMode = UIViewContentModeScaleAspectFit;
        self.imageView.image = image;
        CGImageRelease(instaImage);

        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
    }

    [self dismissModalViewControllerAnimated:YES];
}

CGRect transformCGRectForUIImageOrientation(CGRect source, UIImageOrientation orientation, CGSize imageSize) {
    switch (orientation) {
        case UIImageOrientationLeft: { // EXIF #8
            CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(imageSize.height, 0.0);
            CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate, M_PI_2);
            return CGRectApplyAffineTransform(source, txCompound);
        }
        case UIImageOrientationDown: { // EXIF #3
            CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(imageSize.width, imageSize.height);
            CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate, M_PI);
            return CGRectApplyAffineTransform(source, txCompound);
        }
        case UIImageOrientationRight: { // EXIF #6
            CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(0.0, imageSize.width);
            CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate, M_PI + M_PI_2);
            return CGRectApplyAffineTransform(source, txCompound);
        }
        case UIImageOrientationUp: // EXIF #1 - do nothing
        default: // EXIF 2,4,5,7 - ignore
            return source;
    }
}

UPDATE I have made a couple of methods that take care of the rest of the cases. The steps are basically the same, with a couple of modifications:

  1. The first modification is to correctly transform and scale the context to handle the orientation of the incoming image,
  2. and the second is to support the non-square crops you can get from the UIImagePickerController. In those cases the square image is padded with a fill color of your choosing.

New Code

// The cropRect is assumed to be in UIImageOrientationUp, as it is delivered this way from the UIImagePickerController when allowsEditing is on.
// The sourceImage can be in any orientation; the crop will be transformed to match.
// The output bounds define the final size of the image; the image will be scaled to fit (aspect fit) the bounds, and the fill color will be
// used for areas that are not covered by the scaled image.
- (UIImage *)cropImage:(UIImage *)sourceImage cropRect:(CGRect)cropRect aspectFitBounds:(CGSize)finalImageSize fillColor:(UIColor *)fillColor {

    CGImageRef sourceImageRef = sourceImage.CGImage;

    //Since the crop rect is in UIImageOrientationUp we need to transform it to match the source image.
    CGAffineTransform rectTransform = [self transformSize:sourceImage.size orientation:sourceImage.imageOrientation];
    CGRect transformedRect = CGRectApplyAffineTransform(cropRect, rectTransform);

    //Now we get just the region of the source image that we are interested in.
    CGImageRef cropRectImage = CGImageCreateWithImageInRect(sourceImageRef, transformedRect);

    //Figure out which dimension fits within our final size and calculate the aspect-correct rect that will fit in our new bounds
    CGFloat horizontalRatio = finalImageSize.width / CGImageGetWidth(cropRectImage);
    CGFloat verticalRatio = finalImageSize.height / CGImageGetHeight(cropRectImage);
    CGFloat ratio = MIN(horizontalRatio, verticalRatio); //Aspect fit
    CGSize aspectFitSize = CGSizeMake(CGImageGetWidth(cropRectImage) * ratio, CGImageGetHeight(cropRectImage) * ratio);

    CGContextRef context = CGBitmapContextCreate(NULL,
                                                 finalImageSize.width,
                                                 finalImageSize.height,
                                                 CGImageGetBitsPerComponent(cropRectImage),
                                                 0,
                                                 CGImageGetColorSpace(cropRectImage),
                                                 CGImageGetBitmapInfo(cropRectImage));
    if (context == NULL) {
        NSLog(@"NULL CONTEXT!");
    }

    //Fill with our background color
    CGContextSetFillColorWithColor(context, fillColor.CGColor);
    CGContextFillRect(context, CGRectMake(0, 0, finalImageSize.width, finalImageSize.height));

    //We need to rotate and transform the context based on the orientation of the source image.
    CGAffineTransform contextTransform = [self transformSize:finalImageSize orientation:sourceImage.imageOrientation];
    CGContextConcatCTM(context, contextTransform);

    //Give the context a hint that we want high quality during the scale
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh);

    //Draw our image centered vertically and horizontally in our context.
    CGContextDrawImage(context, CGRectMake((finalImageSize.width - aspectFitSize.width) / 2, (finalImageSize.height - aspectFitSize.height) / 2, aspectFitSize.width, aspectFitSize.height), cropRectImage);

    //Start cleaning up..
    CGImageRelease(cropRectImage);

    CGImageRef finalImageRef = CGBitmapContextCreateImage(context);
    UIImage *finalImage = [UIImage imageWithCGImage:finalImageRef];

    CGContextRelease(context);
    CGImageRelease(finalImageRef);
    return finalImage;
}

//Creates a transform that will correctly rotate and translate for the passed orientation.
//Based on code from niftyBean.com
- (CGAffineTransform)transformSize:(CGSize)imageSize orientation:(UIImageOrientation)orientation {

    CGAffineTransform transform = CGAffineTransformIdentity;
    switch (orientation) {
        case UIImageOrientationLeft: { // EXIF #8
            CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(imageSize.height, 0.0);
            CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate, M_PI_2);
            transform = txCompound;
            break;
        }
        case UIImageOrientationDown: { // EXIF #3
            CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(imageSize.width, imageSize.height);
            CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate, M_PI);
            transform = txCompound;
            break;
        }
        case UIImageOrientationRight: { // EXIF #6
            CGAffineTransform txTranslate = CGAffineTransformMakeTranslation(0.0, imageSize.width);
            CGAffineTransform txCompound = CGAffineTransformRotate(txTranslate, -M_PI_2);
            transform = txCompound;
            break;
        }
        case UIImageOrientationUp: // EXIF #1 - do nothing
        default: // EXIF 2,4,5,7 - ignore
            break;
    }
    return transform;
}
Answered 2012-06-28T02:40:42.477