
I'm trying to create an AVAssetWriter screen capture of an OpenGL project. I've never written an AVAssetWriter or an AVAssetWriterInputPixelBufferAdaptor before, so I'm not sure whether I'm doing it right.

- (id) initWithOutputFileURL:(NSURL *)anOutputFileURL {
    if ((self = [super init])) {
        NSError *error;
        movieWriter = [[AVAssetWriter alloc] initWithURL:anOutputFileURL fileType:AVFileTypeMPEG4 error:&error];
        NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                       AVVideoCodecH264, AVVideoCodecKey,
                                       [NSNumber numberWithInt:640], AVVideoWidthKey,
                                       [NSNumber numberWithInt:480], AVVideoHeightKey,
                                       nil];
        writerInput = [[AVAssetWriterInput
                        assetWriterInputWithMediaType:AVMediaTypeVideo
                        outputSettings:videoSettings] retain];
        NSDictionary *sourcePixelBufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
                                                     [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
                                                     nil];
        writer = [[AVAssetWriterInputPixelBufferAdaptor alloc]
                  initWithAssetWriterInput:writerInput
                  sourcePixelBufferAttributes:sourcePixelBufferAttributes];

        [movieWriter addInput:writerInput];
        writerInput.expectsMediaDataInRealTime = YES;
    }

    return self;
}

The rest of the class:

- (void)getFrame:(CVPixelBufferRef)SampleBuffer:(int64_t)frame{
    frameNumber = frame;
    [writer appendPixelBuffer:SampleBuffer withPresentationTime:CMTimeMake(frame, 24)]; 
}
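For context on the timestamp above: `CMTimeMake(frame, 24)` builds a rational time of `frame / 24` seconds, i.e. the frame index over a 24 fps timescale. A minimal C sketch of that rational-time arithmetic (the `FrameTime` struct and function names are illustrative stand-ins, not the CoreMedia API):

```c
#include <stdint.h>

/* A rational timestamp in the spirit of CMTime: value / timescale seconds. */
typedef struct {
    int64_t value;     /* e.g. the frame index */
    int32_t timescale; /* e.g. 24 for a 24 fps movie */
} FrameTime;

static FrameTime frame_time_make(int64_t frame, int32_t fps)
{
    FrameTime t = { frame, fps };
    return t;
}

static double frame_time_seconds(FrameTime t)
{
    /* Frame 48 at 24 fps is presented 2.0 seconds into the movie. */
    return (double)t.value / (double)t.timescale;
}
```

With a fixed timescale like this, each appended frame advances the presentation time by exactly 1/24 of a second, which is why `frame` must increase monotonically between appends.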

- (void)startRecording {
   [movieWriter startWriting];
   [movieWriter startSessionAtSourceTime:kCMTimeZero];
}

- (void)stopRecording {
   [writerInput markAsFinished];
   [movieWriter endSessionAtSourceTime:CMTimeMake(frameNumber, 24)];
   [movieWriter finishWriting];
}

The asset writer is initialized with:

    NSURL *outputFileURL = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"]];
    recorder = [[GLRecorder alloc] initWithOutputFileURL:outputFileURL];

The view is recorded like this:

    glReadPixels(0, 0, 480, 320, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
    for (int y = 0; y < 320; y++) {
        for (int x = 0; x < 480 * 4; x++) {
            int b2 = ((320 - 1 - y) * 480 * 4 + x);
            int b1 = (y * 4 * 480 + x);
            buffer2[b2] = buffer[b1];
        }
    }
    pixelBuffer = NULL;
    CVPixelBufferCreateWithBytes(NULL, 480, 320, kCVPixelFormatType_32BGRA, buffer2, 1920, NULL, 0, NULL, &pixelBuffer);
    [recorder getFrame:pixelBuffer :framenumber];
    framenumber++;
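The nested loop above flips the image vertically one byte at a time, because glReadPixels returns rows bottom-to-top while the movie expects them top-to-bottom. The same flip can be sketched in plain C copying a whole row per iteration (the helper name `flip_rows` and its parameters are mine, not from the question):

```c
#include <stdint.h>
#include <string.h>

/* Flip a tightly packed RGBA image vertically: row y of the source
   lands at row (height - 1 - y) of the destination. */
static void flip_rows(const uint8_t *src, uint8_t *dst,
                      int width, int height)
{
    size_t rowBytes = (size_t)width * 4; /* 4 bytes per RGBA pixel */
    for (int y = 0; y < height; y++) {
        memcpy(dst + (size_t)(height - 1 - y) * rowBytes,
               src + (size_t)y * rowBytes,
               rowBytes);
    }
}
```

For the 480×320 buffer above this would be `flip_rows(buffer, buffer2, 480, 320)`, with `rowBytes` coming out to the same 1920 passed to `CVPixelBufferCreateWithBytes`.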

Notes:

pixelBuffer is a CVPixelBufferRef.
framenumber is an int64_t.
buffer and buffer2 are GLubyte buffers.

I'm not getting any errors, but when I finish recording there is no file. Any help or links to help would be greatly appreciated. The OpenGL content is live from the camera. I've been able to save the screen as a UIImage, but I want to get the movie I created.


1 Answer


If you're writing RGBA frames, I think you may need to use an AVAssetWriterInputPixelBufferAdaptor to write them out. That class is supposed to manage a pool of pixel buffers, but I'm under the impression it actually massages your data into YUV.

If that works, then I think you'll find your colors are all swapped, at which point you'll probably have to write a pixel shader to convert them to BGRA. Or (shudder) do it on the CPU. Up to you.
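Doing the swap on the CPU, as suggested, amounts to exchanging the R and B bytes of every RGBA pixel; a minimal sketch (the helper name `rgba_to_bgra` is illustrative, not from the answer):

```c
#include <stdint.h>

/* Convert tightly packed RGBA pixels to BGRA in place by
   swapping the R (byte 0) and B (byte 2) of each pixel. */
static void rgba_to_bgra(uint8_t *pixels, int pixelCount)
{
    for (int i = 0; i < pixelCount; i++) {
        uint8_t r = pixels[i * 4 + 0];
        pixels[i * 4 + 0] = pixels[i * 4 + 2]; /* B moves into byte 0 */
        pixels[i * 4 + 2] = r;                 /* R moves into byte 2 */
    }
}
```

This pass could be folded into the existing flip loop so the frame is only walked once, but on the CPU it still touches every byte per frame, which is why a shader-side swizzle is usually preferred.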

Answered 2011-11-11T11:36:43.643