
I am trying to integrate the glGrab code for screen capture on Mac OS X under the configuration mentioned above, and I am currently stuck with an all-blue screen being rendered inside my window. I believe there is some issue with how the image texture is being created, but I can't tell what. I am only a couple of weeks into OpenGL, so please go easy on me if I've missed something obvious.

I am using the glGrab code as-is, except for the CGLSetFullScreen call (and not even CGLSetFullScreenOnDisplay), because these methods are now deprecated. So that one line of code has been commented out for the time being.

I have been researching this topic for some time now and found another thread on Stack Overflow which, while possibly not the complete answer, helped a lot nonetheless: Convert UIImage to CVImageBufferRef

A direct reference to the glGrab code is http://code.google.com/p/captureme/source/browse/trunk/glGrab.c


1 Answer


The answer to my question above follows. So: no more OpenGL or glGrab; this uses what Mac OS X itself provides. It also does not include code for capturing the mouse pointer, but I believe that if you have landed on this page, you are smart enough to figure that out yourself. Or, if anyone reading this knows the solution, here is your chance to help out the fraternity :) (one untested idea is sketched after the code below). This code returns a CVPixelBufferRef; you could instead send back a CGImageRef, or even a raw byte stream, just tweak it to your liking:

#import <Cocoa/Cocoa.h>
#import <CoreVideo/CoreVideo.h>
#include <stdint.h>
#include <stdlib.h>
#include <strings.h>

// Flips the bitmap vertically in place by swapping rows from the outside
// in. (Despite the name, no channel swizzling happens here.)
void swizzleBitmap(void *data, int rowBytes, int height) {
    int top, bottom;
    void *buffer;
    void *topP;
    void *bottomP;
    void *base;

    top = 0;
    bottom = height - 1;
    base = data;
    buffer = malloc(rowBytes);    // scratch space for one row

    while (top < bottom) {
        topP = (void *)((top * rowBytes) + (intptr_t)base);
        bottomP = (void *)((bottom * rowBytes) + (intptr_t)base);

        // Swap the top and bottom rows through the scratch buffer.
        bcopy(topP, buffer, rowBytes);
        bcopy(bottomP, topP, rowBytes);
        bcopy(buffer, bottomP, rowBytes);

        ++top;
        --bottom;
    }
    free(buffer);
}

CVImageBufferRef grabViaOpenGL() {
    // Main screenshot capture call. (No OpenGL is actually involved any
    // more; the name is kept from the original glGrab experiment.)
    CGImageRef image = CGDisplayCreateImage(kCGDirectMainDisplay);

    // Screenshot bounds.
    CGSize frameSize = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image));

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                            [NSNumber numberWithBool:NO], kCVPixelBufferCGImageCompatibilityKey,
                            [NSNumber numberWithBool:NO], kCVPixelBufferCGBitmapContextCompatibilityKey,
                            nil];

    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, frameSize.width,
                                          frameSize.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef)options,
                                          &pxbuffer);
    if (status != kCVReturnSuccess || pxbuffer == NULL) {
        CGImageRelease(image);
        return NULL;
    }

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

    // Use the buffer's actual row stride: Core Video may pad each row
    // beyond width * 4 bytes, so don't assume 4 * frameSize.width.
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pxbuffer);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, frameSize.width,
                                                 frameSize.height, 8, bytesPerRow, rgbColorSpace,
                                                 kCGImageAlphaNoneSkipLast);

    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                           CGImageGetHeight(image)), image);

    swizzleBitmap(pxdata, (int)bytesPerRow, (int)frameSize.height);    // Flip rows; the original "Solution for ARGB madness"

    CGColorSpaceRelease(rgbColorSpace);
    CGImageRelease(image);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;    // Caller owns this (+1) reference
}
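
To consume the buffer from a caller, here is a minimal sketch (the lock/unlock, stride, and release calls are standard Core Video API; the hand-off comment is just a placeholder):

CVPixelBufferRef buffer = grabViaOpenGL();
if (buffer) {
    CVPixelBufferLockBaseAddress(buffer, kCVPixelBufferLock_ReadOnly);
    void *pixels = CVPixelBufferGetBaseAddress(buffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(buffer);
    size_t height = CVPixelBufferGetHeight(buffer);
    // ... hand (height * bytesPerRow) bytes at 'pixels' to your consumer ...
    CVPixelBufferUnlockBaseAddress(buffer, kCVPixelBufferLock_ReadOnly);
    CVPixelBufferRelease(buffer);    // grabViaOpenGL returns a +1 reference
}

And for the mouse pointer left open above, one untested idea: composite the current system cursor into the same bitmap context, right after CGContextDrawImage and before swizzleBitmap. NSCursor's currentSystemCursor and CGEventGetLocation are real APIs (10.6+), but drawCursor is a hypothetical helper and the coordinate flip is my assumption, so verify it on your setup:

// Hypothetical helper; would be called as drawCursor(context, frameSize.height)
// inside grabViaOpenGL, before the swizzle. Untested.
static void drawCursor(CGContextRef context, CGFloat screenHeight) {
    CGEventRef event = CGEventCreate(NULL);
    CGPoint mouse = CGEventGetLocation(event);    // global, top-left origin
    CFRelease(event);

    NSCursor *cursor = [NSCursor currentSystemCursor];
    NSPoint hotSpot = [cursor hotSpot];
    CGImageRef cursorImage = [[cursor image] CGImageForProposedRect:NULL
                                                             context:nil
                                                               hints:nil];
    if (!cursorImage) return;

    CGFloat w = CGImageGetWidth(cursorImage);
    CGFloat h = CGImageGetHeight(cursorImage);
    // The bitmap context has a bottom-left origin while the mouse location
    // has a top-left one, so flip the y coordinate.
    CGRect rect = CGRectMake(mouse.x - hotSpot.x,
                             screenHeight - mouse.y + hotSpot.y - h,
                             w, h);
    CGContextDrawImage(context, rect, cursorImage);
}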
answered 2012-07-12T12:34:57.723