
I'm doing some image processing work that needs floating-point grayscale image data. The method -imagePlanarFData: below is how I currently extract that data from an input CIImage:

- (NSData *)byteswapPlanarFData:(NSData *)data
                    swapInPlace:(BOOL)swapInPlace;
{
    NSData * outputData = swapInPlace ? data : [data mutableCopy];
    const int32_t * image = [outputData bytes];
    size_t length = [outputData length] / sizeof(*image);
    for (int i = 0; i < length; i++) {
        int32_t * val = (int32_t *)&image[i];
        *val = OSSwapBigToHostInt32(*val);
    }
    return outputData;
}

- (NSData *)imagePlanarFData:(CIImage *)processedImage;
{
    NSSize size = [processedImage extent].size;
    if (size.width == 0) {
        return nil;
    }
    dispatch_once(&_onceToken, ^ {
        _colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericGray);
        _bytesPerRow = size.width * sizeof(float);
        _cgx = CGBitmapContextCreate(NULL, size.width, size.height, 32, _bytesPerRow, _colorSpace, kCGImageAlphaNone | kCGBitmapFloatComponents);
        // Work-around for CIImage drawing EXC_BAD_ACCESS when running with Guard Malloc;
        // see <http://stackoverflow.com/questions/11689233/ciimage-drawing-exc-bad-access>
        NSDictionary * options = nil;
        if (getenv("MallocStackLogging") || getenv("MallocStackLoggingNoCompact")) {
            NSLog(@"Forcing CIImageContext to use software rendering; see %@",
                  @"<http://stackoverflow.com/questions/11689233/ciimage-drawing-exc-bad-access>");
            options = @{ kCIContextUseSoftwareRenderer: @YES };
        }
        _cix = [CIContext contextWithCGContext:_cgx
                                       options:options];
        _rect = CGRectMake(0, 0, size.width, size.height);
    });
    float * data = CGBitmapContextGetData(_cgx);
    CGContextClearRect(_cgx, _rect);
    [_cix drawImage:processedImage
             inRect:_rect
           fromRect:_rect];
    NSData * pixelData = [NSData dataWithBytesNoCopy:data
                                              length:_bytesPerRow * size.height
                                        freeWhenDone:NO];
    // For whatever bizarre reason, CoreGraphics uses big-endian floats (!)
    return [self byteswapPlanarFData:pixelData swapInPlace:NO];
}

As noted in the comment, I was surprised to find that the floating-point pixel data comes out of the CGBitmapContext in big-endian order. (I determined this purely by trial and error.) Hence the extra -byteswapPlanarFData:swapInPlace: method was introduced, and all seemed right with the world... for a while.
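To illustrate the kind of trial-and-error check involved (a minimal sketch of my own, not code from the project): fill a 1×1 float gray context with a known value and dump its raw bytes. 0.5f has the bit pattern 0x3F000000, so whichever end the 0x3F byte shows up at tells you the storage order.

// Throwaway check: which byte order does a float gray bitmap context use?
CGColorSpaceRef gray = CGColorSpaceCreateWithName(kCGColorSpaceGenericGray);
CGContextRef ctx = CGBitmapContextCreate(NULL, 1, 1, 32, sizeof(float), gray,
                                         kCGImageAlphaNone | kCGBitmapFloatComponents);
CGContextSetGrayFillColor(ctx, 0.5, 1.0);
CGContextFillRect(ctx, CGRectMake(0, 0, 1, 1));
const uint8_t * raw = CGBitmapContextGetData(ctx);
// Big-endian storage prints 3f 00 00 00; little-endian prints 00 00 00 3f.
NSLog(@"raw bytes: %02x %02x %02x %02x", raw[0], raw[1], raw[2], raw[3]);
CGContextRelease(ctx);
CGColorSpaceRelease(gray);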

Now I want to render this processed data back into a CGImage.

Previously, my code took the float * data buffer extracted above and used it directly to create a new CGImage, which was then wrapped in an NSImage, like this:

CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, data, bytesPerRow * size.height, NULL);
CGImageRef renderedImage = CGImageCreate(size.width, size.height, 32, 32, bytesPerRow, colorSpace, kCGImageAlphaNone | kCGBitmapFloatComponents, provider, NULL, false, kCGRenderingIntentDefault);
CGDataProviderRelease(provider);
NSImage * image = [[NSImage alloc] initWithCGImage:renderedImage
                                              size:size];
CGImageRelease(renderedImage);

So now I'm doing this:

pixelData = [mumble imagePlanarFData:processedImage];
// Swap data back to how it was before...
pixelData = [mumble byteswapPlanarFData:pixelData
                            swapInPlace:YES];
float * data = (float *)[pixelData bytes];
CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, data, bytesPerRow * size.height, NULL);
CGImageRef renderedImage = CGImageCreate(size.width, size.height, 32, 32, bytesPerRow, colorSpace, kCGImageAlphaNone | kCGBitmapFloatComponents, provider, NULL, false, kCGRenderingIntentDefault);
CGDataProviderRelease(provider);
NSImage * image = [[NSImage alloc] initWithCGImage:renderedImage
                                              size:size];
CGImageRelease(renderedImage);

The above does not work. Instead, I get back a corrupted image along with a stream of complaints from CoreGraphics:

Mar 21 05:56:46 aoide.local LabCam[34235] <Error>: CMMConvLut::ConvertFloat 1 input (inf)
Mar 21 05:56:46 aoide.local LabCam[34235] <Error>: ApplySequenceToBitmap failed (-171)
Mar 21 05:56:46 aoide.local LabCam[34235] <Error>: ColorSyncTransformConvert - failed width = 160 height = 199 dstDepth = 7 dstLayout = 0 dstBytesPerRow = 1920 srcDepth = 7 srcLayout = 0 srcBytesPerRow = 640

Where have I gone wrong?


1 Answer


Ah, found the problem... It turns out there was a leftover routine, from before I added the byte-swapping call to -imagePlanarFData:, that was also swapping the bytes in place...
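For what it's worth, here is a small illustration (my own sketch, not project code) of why one swap too many corrupts the data badly enough to provoke the ColorSync errors above: byte-reversing an ordinary float yields a nonsense value, and for some bit patterns the result is even an inf or NaN.

float v = 0.5f;                           // bit pattern 0x3F000000
uint32_t bits;
memcpy(&bits, &v, sizeof(bits));
bits = OSSwapInt32(bits);                 // becomes 0x0000003F
float swapped;
memcpy(&swapped, &bits, sizeof(swapped));
NSLog(@"%g -> %g", v, swapped);           // 0.5 -> about 8.8e-44, a subnormal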

I'd still love to hear an explanation of why CoreGraphics expects these values in big-endian byte order on our little-endian platforms.
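My best guess (an assumption on my part, not something I've confirmed): the context above is created without an explicit byte-order flag, and in that case CoreGraphics seems to treat multi-byte components as big-endian by default. If that's the cause, asking for host byte order when creating the context should make the swapping unnecessary; note that I haven't verified that kCGBitmapByteOrder32Host is accepted together with kCGBitmapFloatComponents for a planar gray context.

// Untested sketch: request host byte order up front (assumed, not verified, to be supported here).
CGBitmapInfo info = kCGImageAlphaNone | kCGBitmapFloatComponents | kCGBitmapByteOrder32Host;
_cgx = CGBitmapContextCreate(NULL, size.width, size.height, 32, _bytesPerRow, _colorSpace, info);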

Answered 2013-03-21T14:20:33.907