
I've found myself with a few NSImage objects that I need to rotate by 90 degrees, change the colour of pixels that are one colour to another colour, and then get the RGB565 data representation as an NSData object.

I found the vImageConvert_ARGB8888toRGB565 function in the Accelerate framework, so that should be able to handle the RGB565 output.
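(For reference, RGB565 packs each pixel into 16 bits: the top 5 bits of red, 6 of green, and 5 of blue. A scalar sketch of the per-pixel conversion in plain C, to illustrate what the vImage call does in bulk; this is not the vImage implementation itself:)

```c
#include <stdint.h>

// Pack one ARGB8888 pixel into RGB565 by keeping the top 5/6/5 bits
// of the red, green, and blue components.
uint16_t argb8888_to_rgb565(uint32_t argb)
{
    uint8_t r = (argb >> 16) & 0xFF;
    uint8_t g = (argb >> 8)  & 0xFF;
    uint8_t b =  argb        & 0xFF;
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}
```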

I've found a few UIImage rotation examples on StackOverflow, but I can't seem to convert them to NSImage, since it appears I have to use NSGraphicsContext rather than a CGContextRef?

Ideally I'd like these in an NSImage category, so I can just call:

NSImage *rotated = [inputImage rotateByDegrees:90];
NSImage *colored = [rotated changeColorFrom:[NSColor redColor] toColor:[NSColor blackColor]];
NSData *rgb565 = [colored rgb565Data];

I just don't know where to start, as image manipulation is new to me.

I appreciate any help I can get.

Edit (22/04/2013)

I've managed to piece this code together to generate the RGB565 data. It comes out upside down and with some small artifacts; I think the first issue is due to the differing coordinate systems in use, and the second may stem from me converting from PNG to BMP. I'll do some more testing using BMPs and non-transparent PNGs.

- (NSData *)RGB565Data
{
    CGContextRef cgctx = CreateARGBBitmapContext(self.CGImage);
    if (cgctx == NULL)
        return nil;

    size_t w = CGImageGetWidth(self.CGImage);
    size_t h = CGImageGetHeight(self.CGImage);
    CGRect rect = {{0,0},{w,h}};
    CGContextDrawImage(cgctx, rect, self.CGImage);

    void *data = CGBitmapContextGetData (cgctx);
    CGContextRelease(cgctx);

    if (!data)
        return nil;

    vImage_Buffer src;
    src.data = data;
    src.width = w;
    src.height = h;
    src.rowBytes = (w * 4);

    void* destData = malloc((w * 2) * h);

    vImage_Buffer dst;
    dst.data = destData;
    dst.width = w;
    dst.height = h;
    dst.rowBytes = (w * 2);

    vImageConvert_ARGB8888toRGB565(&src, &dst, 0);

    size_t dataSize = 2 * w * h; // RGB565 = 16 bits (2 bytes) per pixel: 5-bit red, 6-bit green, 5-bit blue
    NSData *RGB565Data = [NSData dataWithBytes:dst.data length:dataSize];

    free(destData);
    free(data); // the ARGB buffer malloc'd in CreateARGBBitmapContext; releasing the context does not free it

    return RGB565Data;
}
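The upside-down result mentioned above is consistent with Core Graphics' flipped coordinate system. One possible fix (a sketch, not part of the original code) is to reverse the row order of the ARGB buffer in place before handing it to the vImage conversion:

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

// Reverse the row order of an image buffer in place:
// `rowBytes` bytes per row, `height` rows.
void flip_rows_in_place(uint8_t *buf, size_t rowBytes, size_t height)
{
    uint8_t *tmp = malloc(rowBytes);
    if (tmp == NULL)
        return;
    for (size_t i = 0; i < height / 2; i++) {
        uint8_t *top    = buf + i * rowBytes;
        uint8_t *bottom = buf + (height - 1 - i) * rowBytes;
        memcpy(tmp, top, rowBytes);
        memcpy(top, bottom, rowBytes);
        memcpy(bottom, tmp, rowBytes);
    }
    free(tmp);
}
```

(Accelerate also provides vImageVerticalReflect_ARGB8888, which performs the same reflection directly on a vImage_Buffer.)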

- (CGImageRef)CGImage
{
    return [self CGImageForProposedRect:NULL context:[NSGraphicsContext currentContext] hints:nil];
}

CGContextRef CreateARGBBitmapContext (CGImageRef inImage)
{
    CGContextRef    context = NULL;
    CGColorSpaceRef colorSpace;
    void *          bitmapData;
    int             bitmapByteCount;
    int             bitmapBytesPerRow;


    size_t pixelsWide = CGImageGetWidth(inImage);
    size_t pixelsHigh = CGImageGetHeight(inImage);
    bitmapBytesPerRow   = (int)(pixelsWide * 4);
    bitmapByteCount     = (int)(bitmapBytesPerRow * pixelsHigh);

    colorSpace = CGColorSpaceCreateDeviceRGB();
    if (colorSpace == NULL)
        return NULL;

    bitmapData = malloc( bitmapByteCount );
    if (bitmapData == NULL)
    {
        CGColorSpaceRelease( colorSpace );
        return NULL;
    }
    context = CGBitmapContextCreate (bitmapData,
                                 pixelsWide,
                                 pixelsHigh,
                                 8,
                                 bitmapBytesPerRow,
                                 colorSpace,
                                 kCGImageAlphaPremultipliedFirst);
    if (context == NULL)
    {
        free (bitmapData);
        fprintf (stderr, "Context not created!");
    }
    CGColorSpaceRelease( colorSpace );

    return context;
}

3 Answers


For most of this, you'll want to use Core Image.

Rotation you can do with the CIAffineTransform filter. It takes an NSAffineTransform object; you may have used that class before. (You can do the rotation with NSImage itself, but it's easier with Core Image, and you'll probably need it for the next step anyway.)

I don't know what you mean by "change the colour of pixels that are one colour to another colour"; that could mean a lot of different things. Chances are, though, there's a filter for it.

I also don't know why you specifically need 565 data, but assuming you really do, you're right that that function would be the one to involve. Use one of CIContext's lowest-level rendering methods to get 8-bit-per-component ARGB output, then use that vImage function to convert it to 565 RGB.

Answered 2013-04-22T08:10:15.133

I've managed to get what I want by using NSBitmapImageRep (accessing it with a bit of a hack). If anyone knows a better way of doing this, please do share.

The - (NSBitmapImageRep*)bitmap method is my hack. The NSImage starts out with just an NSBitmapImageRep, but after the rotation method a CIImageRep is added, which takes precedence over the NSBitmapImageRep and breaks the colour code (because the NSImage renders the un-colourised CIImageRep).

BitmapImage.m (subclass of NSImage)

CGContextRef CreateARGBBitmapContext (CGImageRef inImage)
{
    CGContextRef    context = NULL;
    CGColorSpaceRef colorSpace;
    void *          bitmapData;
    int             bitmapByteCount;
    int             bitmapBytesPerRow;


    size_t pixelsWide = CGImageGetWidth(inImage);
    size_t pixelsHigh = CGImageGetHeight(inImage);
    bitmapBytesPerRow   = (int)(pixelsWide * 4);
    bitmapByteCount     = (int)(bitmapBytesPerRow * pixelsHigh);

    colorSpace = CGColorSpaceCreateDeviceRGB();
    if (colorSpace == NULL)
        return NULL;

    bitmapData = malloc( bitmapByteCount );
    if (bitmapData == NULL)
    {
        CGColorSpaceRelease( colorSpace );
        return NULL;
    }
    context = CGBitmapContextCreate (bitmapData,
                                     pixelsWide,
                                     pixelsHigh,
                                     8,
                                     bitmapBytesPerRow,
                                     colorSpace,
                                     kCGImageAlphaPremultipliedFirst);
    if (context == NULL)
    {
        free (bitmapData);
        fprintf (stderr, "Context not created!");
    }
    CGColorSpaceRelease( colorSpace );

    return context;
}

- (NSData *)RGB565Data
{
    CGContextRef cgctx = CreateARGBBitmapContext(self.CGImage);
    if (cgctx == NULL)
        return nil;

    size_t w = CGImageGetWidth(self.CGImage);
    size_t h = CGImageGetHeight(self.CGImage);
    CGRect rect = {{0,0},{w,h}};
    CGContextDrawImage(cgctx, rect, self.CGImage);

    void *data = CGBitmapContextGetData (cgctx);
    CGContextRelease(cgctx);

    if (!data)
        return nil;

    vImage_Buffer src;
    src.data = data;
    src.width = w;
    src.height = h;
    src.rowBytes = (w * 4);

    void* destData = malloc((w * 2) * h);

    vImage_Buffer dst;
    dst.data = destData;
    dst.width = w;
    dst.height = h;
    dst.rowBytes = (w * 2);

    vImageConvert_ARGB8888toRGB565(&src, &dst, 0);

    size_t dataSize = 2 * w * h; // RGB565 = 16 bits (2 bytes) per pixel: 5-bit red, 6-bit green, 5-bit blue
    NSData *RGB565Data = [NSData dataWithBytes:dst.data length:dataSize];

    free(destData);
    free(data); // the ARGB buffer malloc'd in CreateARGBBitmapContext; releasing the context does not free it

    return RGB565Data;
}

- (NSBitmapImageRep*)bitmap
{
    NSBitmapImageRep *bitmap = nil;

    NSMutableArray *repsToRemove = [NSMutableArray array];

    // Iterate through the representations that back the NSImage
    for (NSImageRep *rep in self.representations)
    {
        // If the representation is a bitmap
        if ([rep isKindOfClass:[NSBitmapImageRep class]])
        {
            bitmap = [(NSBitmapImageRep*)rep retain];
            break;
        }
        else
        {
            [repsToRemove addObject:rep];
        }
    }

    // If no bitmap representation was found, we create one (this shouldn't occur)
    if (bitmap == nil)
    {
        bitmap = [[[NSBitmapImageRep alloc] initWithCGImage:self.CGImage] retain];
        [self addRepresentation:bitmap];
    }

    for (NSImageRep *rep2 in repsToRemove)
    {
        [self removeRepresentation:rep2];
    }

    return [bitmap autorelease];
}

- (NSColor*)colorAtX:(NSInteger)x y:(NSInteger)y
{
    return [self.bitmap colorAtX:x y:y];
}

- (void)setColor:(NSColor*)color atX:(NSInteger)x y:(NSInteger)y
{
    [self.bitmap setColor:color atX:x y:y];
}

NSImage+Extra.m (NSImage category)

- (CGImageRef)CGImage
{
    return [self CGImageForProposedRect:NULL context:[NSGraphicsContext currentContext] hints:nil];
}

Usage

- (IBAction)load:(id)sender
{
    NSOpenPanel* openDlg = [NSOpenPanel openPanel];

    [openDlg setCanChooseFiles:YES];

    [openDlg setCanChooseDirectories:YES];

    if ( [openDlg runModalForDirectory:nil file:nil] == NSOKButton )
    {
        NSArray* files = [openDlg filenames];

        for( int i = 0; i < [files count]; i++ )
        {
            NSString* fileName = [files objectAtIndex:i];
            BitmapImage *image = [[BitmapImage alloc] initWithContentsOfFile:fileName];

            imageView.image = image;
        }
    }
}

- (IBAction)colorize:(id)sender
{
    float width = imageView.image.size.width;
    float height = imageView.image.size.height;

    BitmapImage *img = (BitmapImage*)imageView.image;

    NSColor *newColor = [img colorAtX:1 y:1];

    for (int x = 0; x < width; x++) // < rather than <=, to stay within the bitmap bounds
    {
        for (int y = 0; y < height; y++)
        {
            if ([img colorAtX:x y:y] == newColor)
            {
                [img setColor:[NSColor redColor] atX:x y:y];
            }
        }
    }
    [imageView setNeedsDisplay:YES];
}

- (IBAction)rotate:(id)sender
{
    BitmapImage *img = (BitmapImage*)imageView.image;
    BitmapImage *newImg = [img rotate90DegreesClockwise:NO];
    imageView.image = newImg;
}

Edit (24/04/2013)

I've changed the following code:

- (RGBColor)colorAtX:(NSInteger)x y:(NSInteger)y
{
    NSUInteger components[4];
    [self.bitmap getPixel:components atX:x y:y];
    //NSLog(@"R: %ld, G:%ld, B:%ld", components[0], components[1], components[2]);

    RGBColor color = {components[0], components[1], components[2]};

    return color;
}

- (BOOL)color:(RGBColor)a isEqualToColor:(RGBColor)b
{
    return ((a.red == b.red) && (a.green == b.green) && (a.blue == b.blue));
}

- (void)setColor:(RGBColor)color atX:(NSUInteger)x y:(NSUInteger)y
{
    NSUInteger components[4] = {(NSUInteger)color.red, (NSUInteger)color.green, (NSUInteger)color.blue, 255};

    //NSLog(@"R: %ld, G: %ld, B: %ld", components[0], components[1], components[2]);

    [self.bitmap setPixel:components atX:x y:y];
}
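The RGBColor type is not defined in the snippets above; a minimal struct along these lines is assumed (field names inferred from the accessors used; a plain-C stand-in for illustration, together with the equality check the color:isEqualToColor: method performs):

```c
#include <stdbool.h>

// Hypothetical definition matching the .red/.green/.blue accesses above.
typedef struct {
    unsigned red;
    unsigned green;
    unsigned blue;
} RGBColor;

// Component-wise equality, as in color:isEqualToColor: above.
bool rgbcolor_equal(RGBColor a, RGBColor b)
{
    return a.red == b.red && a.green == b.green && a.blue == b.blue;
}
```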

- (IBAction)colorize:(id)sender
{
    float width = imageView.image.size.width;
    float height = imageView.image.size.height;

    BitmapImage *img = (BitmapImage*)imageView.image;

    RGBColor oldColor = [img colorAtX:0 y:0];
    RGBColor newColor;// = {255, 0, 0};
    newColor.red = 255;
    newColor.green = 0;
    newColor.blue = 0;

    for (int x = 0; x < width; x++) // < rather than <=, to stay within the bitmap bounds
    {
        for (int y = 0; y < height; y++)
        {
            if ([img color:[img colorAtX:x y:y] isEqualToColor:oldColor])
            {
                [img setColor:newColor atX:x y:y];
            }
        }
    }
    [imageView setNeedsDisplay:YES];
}

But now it turns the pixels red the first time the colorize method is called, and then blue.

Edit 2 (24/04/2013)

The code below fixed it. It was because the rotation code adds an alpha channel to the NSBitmapImageRep.

- (RGBColor)colorAtX:(NSInteger)x y:(NSInteger)y
{
    if (self.bitmap.hasAlpha)
    {
        NSUInteger components[4];
        [self.bitmap getPixel:components atX:x y:y];
        RGBColor color = {components[1], components[2], components[3]};
        return color;
    }
    else
    {
        NSUInteger components[3];
        [self.bitmap getPixel:components atX:x y:y];
        RGBColor color = {components[0], components[1], components[2]};
        return color;
    }
}

- (void)setColor:(RGBColor)color atX:(NSUInteger)x y:(NSUInteger)y
{
    if (self.bitmap.hasAlpha)
    {
        NSUInteger components[4] = {255, (NSUInteger)color.red, (NSUInteger)color.green, (NSUInteger)color.blue};
        [self.bitmap setPixel:components atX:x y:y];
    }
    else
    {
        NSUInteger components[3] = {color.red, color.green, color.blue};
        [self.bitmap setPixel:components atX:x y:y];
    }
}
Answered 2013-04-23T13:49:58.893

OK, I decided to spend a day researching Peter's suggestion of using Core Image. I'd done some research into it previously and deemed it too hard, but after an entire day of research I finally worked out what I needed to do, and amazingly it couldn't be easier.

Early on I'd decided that the Apple ChromaKey Core Image sample would be a good starting point, but the example code scared me off because of its three-dimensional colour cube. After watching the WWDC 2012 video on Core Image and finding some sample code on GitHub ( https://github.com/vhbit/ColorCubeSample ), I decided to jump in and just give it a go.

Here are the important parts of the working code. I haven't included the RGB565Data method because I haven't written it yet, but it should be easy using the method Peter suggested:

CIImage+Extras.h

- (NSImage*) NSImage;
- (CIImage*) imageRotated90DegreesClockwise:(BOOL)clockwise;
- (CIImage*) imageWithChromaColor:(NSColor*)chromaColor BackgroundColor:(NSColor*)backColor;
- (NSColor*) colorAtX:(NSUInteger)x y:(NSUInteger)y;

CIImage+Extras.m

- (NSImage*) NSImage
{
    CGContextRef cg = [[NSGraphicsContext currentContext] graphicsPort];
    CIContext *context = [CIContext contextWithCGContext:cg options:nil];
    CGImageRef cgImage = [context createCGImage:self fromRect:self.extent];

    NSImage *image = [[NSImage alloc] initWithCGImage:cgImage size:NSZeroSize];
    CGImageRelease(cgImage); // createCGImage:fromRect: returns a +1 reference

    return [image autorelease];
}

- (CIImage*) imageRotated90DegreesClockwise:(BOOL)clockwise
{
    CIImage *im = self;
    CIFilter *f = [CIFilter filterWithName:@"CIAffineTransform"];
    NSAffineTransform *t = [NSAffineTransform transform];
    [t rotateByDegrees:clockwise ? -90 : 90];
    [f setValue:t forKey:@"inputTransform"];
    [f setValue:im forKey:@"inputImage"];
    im = [f valueForKey:@"outputImage"];

    CGRect extent = [im extent];
    f = [CIFilter filterWithName:@"CIAffineTransform"];
    t = [NSAffineTransform transform];
    [t translateXBy:-extent.origin.x
                yBy:-extent.origin.y];
    [f setValue:t forKey:@"inputTransform"];
    [f setValue:im forKey:@"inputImage"];
    im = [f valueForKey:@"outputImage"];

    return im;
}
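The second CIAffineTransform pass is needed because rotating about the origin moves the image's extent off (0, 0). The extent arithmetic can be sketched in plain C (for the counterclockwise case, where each point (x, y) maps to (-y, x)):

```c
typedef struct { double x, y, w, h; } Rect;

// Extent of `r` after a 90-degree counterclockwise rotation about the
// origin. Each corner (x, y) maps to (-y, x), so the new origin is
// (-(y + h), x) and the width and height swap.
Rect rotate90ccw_extent(Rect r)
{
    Rect out = { -(r.y + r.h), r.x, r.h, r.w };
    return out;
}
```

Translating by (-extent.origin.x, -extent.origin.y), as the second filter pass above does, then puts the rotated image back at the origin.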

- (CIImage*) imageWithChromaColor:(NSColor*)chromaColor BackgroundColor:(NSColor*)backColor
{
    CIImage *im = self;

    CIColor *backCIColor = [[CIColor alloc] initWithColor:backColor];
    CIImage *backImage = [CIImage imageWithColor:backCIColor];
    backImage = [backImage imageByCroppingToRect:self.extent];
    [backCIColor release];
    float chroma[3];

    chroma[0] = chromaColor.redComponent;
    chroma[1] = chromaColor.greenComponent;
    chroma[2] = chromaColor.blueComponent;

    // Allocate memory
    const unsigned int size = 64;
    const unsigned int cubeDataSize = size * size * size * sizeof (float) * 4;
    float *cubeData = (float *)malloc (cubeDataSize);
    float rgb[3];//, *c = cubeData;

    // Populate cube with a simple gradient going from 0 to 1
    size_t offset = 0;
    for (int z = 0; z < size; z++){
        rgb[2] = ((double)z)/(size-1); // Blue value
        for (int y = 0; y < size; y++){
            rgb[1] = ((double)y)/(size-1); // Green value
            for (int x = 0; x < size; x ++){
                rgb[0] = ((double)x)/(size-1); // Red value
                // NB: exact float comparison; this only matches when the chroma
                // components land exactly on the cube's lattice values (n/(size-1))
                float alpha = ((rgb[0] == chroma[0]) && (rgb[1] == chroma[1]) && (rgb[2] == chroma[2])) ? 0.0 : 1.0;

                cubeData[offset]   = rgb[0] * alpha;
                cubeData[offset+1] = rgb[1] * alpha;
                cubeData[offset+2] = rgb[2] * alpha;
                cubeData[offset+3] = alpha;

                offset += 4;
            }
        }
    }

    // Create memory with the cube data
    NSData *data = [NSData dataWithBytesNoCopy:cubeData
                                        length:cubeDataSize
                                  freeWhenDone:YES];
    CIFilter *colorCube = [CIFilter filterWithName:@"CIColorCube"];
    [colorCube setValue:[NSNumber numberWithInt:size] forKey:@"inputCubeDimension"];
    // Set data for cube
    [colorCube setValue:data forKey:@"inputCubeData"];

    [colorCube setValue:im forKey:@"inputImage"];
    im = [colorCube valueForKey:@"outputImage"];

    CIFilter *sourceOver = [CIFilter filterWithName:@"CISourceOverCompositing"];
    [sourceOver setValue:im forKey:@"inputImage"];
    [sourceOver setValue:backImage forKey:@"inputBackgroundImage"];

    im = [sourceOver valueForKey:@"outputImage"];

    return im;
}
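For reference, the cube built above stores RGBA floats for lattice point (x, y, z) at the offset the `offset += 4` loop produces; a plain-C restatement of that indexing (x varies fastest, then y, then z, with 4 floats per entry):

```c
#include <stddef.h>

// Float-index offset of lattice point (x, y, z) in a size*size*size
// RGBA float cube laid out as the loop above fills it.
size_t cube_offset(unsigned x, unsigned y, unsigned z, unsigned size)
{
    return 4u * (((size_t)z * size + y) * size + x);
}
```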

- (NSColor*)colorAtX:(NSUInteger)x y:(NSUInteger)y
{
    NSBitmapImageRep* bitmap = [[NSBitmapImageRep alloc] initWithCIImage:self];
    NSColor *color = [bitmap colorAtX:x y:y];
    [bitmap release];
    return color;
}
Answered 2013-04-24T10:20:32.590