
I'm getting a CVImageBufferRef from the camera output, converting it to a CGImageRef with VideoToolbox, and then converting that to a UIImage.

Strangely, when I check the UIImage's size, it reports the pixel width rather than the point width.

In other words: CGImageGetWidth(image) == imageToCrop.size.width

That seems wrong, since a UIImage's size is supposed to be in points.

https://developer.apple.com/documentation/uikit/uiimage/1624105-size?language=objc

The logical dimensions of the image, measured in points.

Running iOS 11.1.

#import <UIKit/UIKit.h>
#import <VideoToolbox/VideoToolbox.h>

CVImageBufferRef lastPixelBuffer; // assume this is filled already
...
CGImageRef image = NULL;
VTCreateCGImageFromCVPixelBuffer(lastPixelBuffer, NULL, &image);
UIImage *imageToCrop = [UIImage imageWithCGImage:image];

NSLog(@"image.width = %zu", CGImageGetWidth(image));
NSLog(@"imageToCrop.size.width = %f", imageToCrop.size.width);
NSLog(@"imageToCrop.scale = %f", imageToCrop.scale);
NSLog(@"underlying CGImageRef, width = %zu", CGImageGetWidth(imageToCrop.CGImage));

2017-11-02 16:16:00.895449-0400 TestBed[3950:1202023] image.width = 1080

2017-11-02 16:16:00.895515-0400 TestBed[3950:1202023] imageToCrop.size.width = 1080.000000

2017-11-02 16:16:00.895558-0400 TestBed[3950:1202023] imageToCrop.scale = 1.000000

2017-11-02 16:16:00.895663-0400 TestBed[3950:1202023] underlying CGImageRef, width = 1080
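For context, my understanding is that UIKit derives the point size from the pixel size divided by the scale, and that imageWithCGImage: assigns a scale of 1.0, which would make points and pixels coincide. A minimal sketch of that arithmetic in plain C, using the logged values above (the variable names are mine, not UIKit's):

#include <stdio.h>
#include <assert.h>

int main(void) {
    size_t pixelWidth = 1080;   /* CGImageGetWidth(image) from the log */
    double scale = 1.0;         /* imageWithCGImage: defaults to scale 1.0 */

    /* UIImage.size.width = pixel width / scale */
    double pointWidth = (double)pixelWidth / scale;
    printf("%.0f\n", pointWidth);           /* prints 1080: points == pixels at scale 1.0 */

    /* The same 1080-pixel image created at scale 2.0 would report 540 points. */
    assert((double)pixelWidth / 2.0 == 540.0);
    return 0;
}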
