I scanned an image (.tiff) using "Macintosh millions of colors", which means 24 bits per pixel. The scanned image I am getting has the following attributes: size = 330 KB and dimensions = 348 * 580 pixels. Since there are 24 bits (3 bytes) per pixel, the size should actually be 348 * 580 * 3 = 605,520 bytes, i.e. roughly 605 KB.
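To make the arithmetic explicit, this is a minimal sketch of how I am computing that expected size (it assumes 3 bytes per pixel with no row padding and no compression, which is what I am comparing against the 330 KB file size):

// Expected raw size, assuming 3 bytes per pixel, no row padding, no compression.
size_t width = 348;
size_t height = 580;
size_t bytesPerPixel = 3;                              // 24 bits per pixel
size_t expectedBytes = width * height * bytesPerPixel; // 605,520 bytes (~605 KB)
NSLog(@"Expected uncompressed size: %zu bytes", expectedBytes);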
Is something incorrect here? I also used the following code to extract the raw image data from the URL of the scanned image:
#import <Cocoa/Cocoa.h>
#import <ImageIO/ImageIO.h>

// Load the scanned file and get its TIFF representation.
NSString *urlName = [url path];
NSImage *image = [[NSImage alloc] initWithContentsOfFile:urlName];
NSData *imageData = [image TIFFRepresentation];

// Decode the first image in the TIFF data and query its properties.
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
CGImageRef imageRef = CGImageSourceCreateImageAtIndex(source, 0, NULL);
NSUInteger numberOfBitsPerPixel = CGImageGetBitsPerPixel(imageRef);
NSUInteger height = CGImageGetHeight(imageRef);
NSUInteger width = CGImageGetWidth(imageRef);
This code reports the same width, height, and bits-per-pixel values for the image.
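In case it helps, here is a small sketch (continuing from the code above and reusing the same imageRef and imageData variables) that compares the size of the decoded bitmap in memory with the size of the TIFF data itself:

// Size of the decoded bitmap in memory; bytesPerRow may include row padding.
size_t bytesPerRow = CGImageGetBytesPerRow(imageRef);
size_t decodedBytes = bytesPerRow * CGImageGetHeight(imageRef);

// Size that the TIFF representation actually occupies.
NSUInteger storedBytes = [imageData length];
NSLog(@"Decoded bitmap: %zu bytes, stored TIFF data: %lu bytes",
      decodedBytes, (unsigned long)storedBytes);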
Basically, I have to use this image's information to reproduce the image somewhere else in another form, so if I cannot get the correct information, the final product will not be reproducible. What could be wrong here?
P.S.: If any other information is needed to answer the question, I'll be happy to provide it.