
I'm using glReadPixels to grab screenshots of my OpenGL scene, and then turning them into a video with AVAssetWriter on iOS 4. My problem is that I need to pass an alpha channel to the video, which only accepts kCVPixelFormatType_32ARGB, while glReadPixels retrieves RGBA. So basically I need a way to convert my RGBA to ARGB, in other words to put the alpha byte first.

int depth = 4;
unsigned char buffer[width * height * depth];
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

// Wrap the raw RGBA bytes from glReadPixels in a CGImage
CGDataProviderRef ref = CGDataProviderCreateWithData(NULL, buffer, width*height*depth, NULL);

CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault | kCGImageAlphaPremultipliedLast;

CGImageRef image = CGImageCreate(width, height, 8, 32, width*depth, CGColorSpaceCreateDeviceRGB(), bitmapInfo, ref, NULL, true, kCGRenderingIntentDefault);

UIWindow* parentWindow = [self window];

NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey, [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, nil];

CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, width, height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options, &pxbuffer);

NSParameterAssert(status == kCVReturnSuccess);
NSParameterAssert(pxbuffer != NULL);

CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
NSParameterAssert(pxdata != NULL);

CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, width, height, 8, depth*width, rgbColorSpace, kCGImageAlphaPremultipliedFirst);

NSParameterAssert(context);

CGContextConcatCTM(context, parentWindow.transform);
CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);

CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);

CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

return pxbuffer; // chuck pixel buffer into AVAssetWriter

Thought I'd post the whole thing, since it may help somebody else.

Cheers


6 Answers


Note: I'm assuming 8 bits per channel here. If that's not the case, adjust accordingly.

To move the alpha bits to the front, you need to perform a rotation. This is usually most easily expressed through bit shifting.

In this case, you want to shift the RGB bits 8 to the right and the A bits 24 to the left, then put the two values together with a bitwise OR, so it becomes argb = (rgba >> 8) | (rgba << 24).
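
As a concrete illustration (my addition, not part of the original answer), here is that rotation applied across a whole frame. One caveat: the formula above reads each pixel as a big-endian 32-bit word; on the iPhone's little-endian ARM, the bytes R,G,B,A load as 0xAABBGGRR, so moving alpha to the front byte in memory is the rotation in the opposite direction:

#include <stdint.h>
#include <stddef.h>

// Rotate every 32-bit pixel so the alpha byte moves from last to first.
static void rotateRGBAToARGB(uint32_t *pixels, size_t pixelCount)
{
    for (size_t i = 0; i < pixelCount; i++) {
        uint32_t rgba = pixels[i];
        // Little-endian (iOS) variant; on a big-endian machine this would be
        // (rgba >> 8) | (rgba << 24), exactly as in the formula above.
        pixels[i] = (rgba << 8) | (rgba >> 24);
    }
}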

Answered 2010-10-16T17:43:31.357

Even better, don't encode your video using ARGB; send AVAssetWriter BGRA frames instead. As I describe in this answer, doing so lets you encode 640x480 video at 30 FPS on an iPhone 4, and up to 20 FPS for 720p video. An iPhone 4S can encode 1080p video at 30 FPS using this.

Also, you'll want to make sure you use a pixel buffer pool instead of recreating a pixel buffer each time. Copying the code from that answer, you'd configure the AVAssetWriter like this:

NSError *error = nil;

assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL fileType:AVFileTypeAppleM4V error:&error];
if (error != nil)
{
    NSLog(@"Error: %@", error);
}


NSMutableDictionary * outputSettings = [[NSMutableDictionary alloc] init];
[outputSettings setObject: AVVideoCodecH264 forKey: AVVideoCodecKey];
[outputSettings setObject: [NSNumber numberWithInt: videoSize.width] forKey: AVVideoWidthKey];
[outputSettings setObject: [NSNumber numberWithInt: videoSize.height] forKey: AVVideoHeightKey];


assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
assetWriterVideoInput.expectsMediaDataInRealTime = YES;

// You need to use BGRA for the video in order to get realtime encoding. I use a color-swizzling shader to line up glReadPixels' normal RGBA output with the movie input's BGRA.
NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
                                                       [NSNumber numberWithInt:videoSize.width], kCVPixelBufferWidthKey,
                                                       [NSNumber numberWithInt:videoSize.height], kCVPixelBufferHeightKey,
                                                       nil];

assetWriterPixelBufferInput = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:assetWriterVideoInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];

[assetWriter addInput:assetWriterVideoInput];

and then use this code to grab each rendered frame with glReadPixels():

CVPixelBufferRef pixel_buffer = NULL;

CVReturn status = CVPixelBufferPoolCreatePixelBuffer (NULL, [assetWriterPixelBufferInput pixelBufferPool], &pixel_buffer);
if ((pixel_buffer == NULL) || (status != kCVReturnSuccess))
{
    return;
}
else
{
    CVPixelBufferLockBaseAddress(pixel_buffer, 0);
    GLubyte *pixelBufferData = (GLubyte *)CVPixelBufferGetBaseAddress(pixel_buffer);
    glReadPixels(0, 0, videoSize.width, videoSize.height, GL_RGBA, GL_UNSIGNED_BYTE, pixelBufferData);
}

// May need to add a check here, because if two consecutive times with the same value are added to the movie, it aborts recording
CMTime currentTime = CMTimeMakeWithSeconds([[NSDate date] timeIntervalSinceDate:startTime],120);
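
// One possible guard against that (my sketch, not the original answer's code):
// skip any frame whose timestamp hasn't advanced past the previous one.
// `previousFrameTime` is a hypothetical CMTime ivar initialized to kCMTimeInvalid.
if (CMTIME_IS_VALID(previousFrameTime) && (CMTimeCompare(currentTime, previousFrameTime) <= 0))
{
    CVPixelBufferUnlockBaseAddress(pixel_buffer, 0);
    CVPixelBufferRelease(pixel_buffer);
    return;
}
previousFrameTime = currentTime;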

if(![assetWriterPixelBufferInput appendPixelBuffer:pixel_buffer withPresentationTime:currentTime]) 
{
    NSLog(@"Problem appending pixel buffer at time: %lld", currentTime.value);
} 
else 
{
//        NSLog(@"Recorded pixel buffer at time: %lld", currentTime.value);
}
CVPixelBufferUnlockBaseAddress(pixel_buffer, 0);

CVPixelBufferRelease(pixel_buffer);

You'll need to color-swizzle your frame when using glReadPixels(), so I employed an offscreen FBO and a fragment shader with the following code to do it:

 varying highp vec2 textureCoordinate;

 uniform sampler2D inputImageTexture;

 void main()
 {
     gl_FragColor = texture2D(inputImageTexture, textureCoordinate).bgra;
 }

However, on iOS 5.0 there is an even faster way to grab OpenGL ES content than glReadPixels(), which I describe in this answer. The nice thing about that process is that the textures already store their content in BGRA pixel format, so you can feed the encapsulating pixel buffers straight to an AVAssetWriter without any color conversion and still see great encoding speeds.
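
A rough sketch of that iOS 5.0 texture cache path (my illustration, assuming a pre-ARC EAGLContext named context plus the videoSize and assetWriterPixelBufferInput from above; error checking omitted):

CVOpenGLESTextureCacheRef textureCache = NULL;
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, context, NULL, &textureCache);

// Pull a BGRA pixel buffer from the writer's pool and wrap it in a texture.
CVPixelBufferRef renderTarget = NULL;
CVPixelBufferPoolCreatePixelBuffer(NULL, [assetWriterPixelBufferInput pixelBufferPool], &renderTarget);

CVOpenGLESTextureRef renderTexture = NULL;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, renderTarget, NULL,
                                             GL_TEXTURE_2D, GL_RGBA,
                                             (GLsizei)videoSize.width, (GLsizei)videoSize.height,
                                             GL_BGRA, GL_UNSIGNED_BYTE, 0, &renderTexture);

// Attach the texture to your FBO; everything rendered into it lands directly
// in the pixel buffer's BGRA bytes, so the finished frame can be appended to
// the pixel buffer adaptor without any glReadPixels() call.
glBindTexture(CVOpenGLESTextureGetTarget(renderTexture), CVOpenGLESTextureGetName(renderTexture));
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                       CVOpenGLESTextureGetName(renderTexture), 0);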

Answered 2012-04-26T16:00:43.457

I realize this question has been answered, but I wanted to make sure people know about vImage, which is part of the Accelerate framework and available on both iOS and OS X. My understanding is that Core Graphics uses vImage to do CPU-bound vector operations on bitmaps.

The specific API you want for converting ARGB to RGBA is vImagePermuteChannels_ARGB8888. There are also APIs to convert RGB to ARGB/XRGB, to flip an image, to overwrite channels, and much more. It's something of a hidden gem!
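
As a quick illustration (my sketch, not from the original answer), here is how that call could swizzle an RGBA buffer, such as glReadPixels output, into ARGB in place; the function works on any 8-bit, 4-channel buffer and simply reorders channels according to the permute map:

#include <stdio.h>
#include <Accelerate/Accelerate.h>

// Permute RGBA bytes into ARGB order in place. Each entry of the map names
// the source channel that feeds that destination channel:
// dest A <- src 3 (A), dest R <- src 0 (R), dest G <- src 1 (G), dest B <- src 2 (B)
void swizzleRGBAToARGB(void *pixels, size_t width, size_t height)
{
    vImage_Buffer buf = {
        .data     = pixels,
        .height   = height,
        .width    = width,
        .rowBytes = width * 4
    };
    const uint8_t permuteMap[4] = { 3, 0, 1, 2 };
    vImage_Error err = vImagePermuteChannels_ARGB8888(&buf, &buf, permuteMap, kvImageNoFlags);
    if (err != kvImageNoError) {
        printf("vImagePermuteChannels_ARGB8888 failed: %ld\n", (long)err);
    }
}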

Update: Brad Larson wrote a great answer to essentially the same question here.

Answered 2013-03-26T03:21:08.407

I'm pretty sure the alpha values can be ignored. If so, you can just do a memcpy with the pixel buffer array shifted by one byte:

void *buffer = malloc(width*height*4);
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
…
// Copy shifted one byte so each pixel's R,G,B bytes land in the R,G,B slots
// of the ARGB destination; every A slot then holds the previous pixel's
// alpha, which doesn't matter if alpha is ignored.
memcpy((char *)pxdata + 1, buffer, width*height*4 - 1);
Answered 2011-12-13T21:04:53.273

Yep, it's 8 bits per channel, so it goes something like this:

int depth = 4;
int width = 320;
int height = 480;

unsigned char buffer[width * height * depth];

glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

for(int i = 0; i < width; i++){
   for(int j = 0; j < height; j++){     
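    // Note: this indexes single bytes with i*j rather than whole 32-bit
    // pixels with (j * width + i), and the shifts operate on an unsigned
    // char, so the rotation never sees a full RGBA pixel. That's why this
    // loop doesn't work.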
    buffer[i*j] = (buffer[i*j] >> 8) | (buffer[i*j] << 24);
    }
}

I can't seem to get it to work, though.

Answered 2010-10-16T18:15:22.693
+ (UIImage *) createARGBImageFromRGBAImage: (UIImage *)image {
    CGSize dimensions = [image size];

    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * dimensions.width;
    NSUInteger bitsPerComponent = 8;

    unsigned char *rgba = malloc(bytesPerPixel * dimensions.width * dimensions.height);
    unsigned char *argb = malloc(bytesPerPixel * dimensions.width * dimensions.height);

    CGColorSpaceRef colorSpace = NULL;
    CGContextRef context = NULL;

    colorSpace = CGColorSpaceCreateDeviceRGB();
    context = CGBitmapContextCreate(rgba, dimensions.width, dimensions.height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrderDefault); // kCGBitmapByteOrder32Big
    CGContextDrawImage(context, CGRectMake(0, 0, dimensions.width, dimensions.height), [image CGImage]);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

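    // Reorder each pixel's bytes from RGBA to ARGB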
    for (int x = 0; x < dimensions.width; x++) {
        for (int y = 0; y < dimensions.height; y++) {
            NSUInteger offset = ((dimensions.width * y) + x) * bytesPerPixel;
            argb[offset + 0] = rgba[offset + 3];
            argb[offset + 1] = rgba[offset + 0];
            argb[offset + 2] = rgba[offset + 1];
            argb[offset + 3] = rgba[offset + 2];
        }
    }

    colorSpace = CGColorSpaceCreateDeviceRGB();
    context = CGBitmapContextCreate(argb, dimensions.width, dimensions.height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrderDefault); // kCGBitmapByteOrder32Big
    CGImageRef imageRef = CGBitmapContextCreateImage(context);
    image = [UIImage imageWithCGImage: imageRef];
    CGImageRelease(imageRef);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    free(rgba);
    free(argb);

    return image;
}
Answered 2011-01-18T07:23:59.337