I'm using an AVAssetWriter with a single video input to write video frames to a movie. The mechanics of the write loop aren't the problem: the memory management is.

Regarding the CMSampleBufferRef that I append: a) I create a CVPixelBufferRef from some image data (raw bytes), so the CVPixelBuffer owns the data. b) I then wrap it in a CMSampleBufferRef like this (error checking stripped for brevity):
+ (CMSampleBufferRef)wrapImageBufferInCMSampleBuffer:(CVImageBufferRef)imageBuffer
                                          timingInfo:(CMSampleTimingInfo const *)timingInfo
                                               error:(NSError **)error {
    // Create a format description for the pixel buffer
    CMVideoFormatDescriptionRef formatDescription = NULL;
    OSStatus result = CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, imageBuffer, &formatDescription);

    // Finally, create a CMSampleBuffer wrapper around it all
    CMSampleBufferRef sampleBuffer = NULL;
    OSStatus sampleBufferResult = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, imageBuffer, YES, NULL, NULL, formatDescription, timingInfo, &sampleBuffer);
    if (formatDescription) {
        CFRelease(formatDescription);
    }
    return sampleBuffer;
}
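Since the method name doesn't follow the Core Foundation Create/Copy naming convention, it's worth spelling out the ownership it implies: the returned sample buffer is +1 and the call site is expected to balance it. A minimal hedged example of the intended usage (the surrounding names are from the repro code further down):

```objc
// The wrapper returns a +1 CMSampleBuffer (Create Rule semantics), so the
// call site balances it after handing the frame to the writer input.
CMSampleBufferRef sampleBuffer = [OSGUtils wrapImageBufferInCMSampleBuffer:pixelBuffer
                                                                timingInfo:&timing
                                                                     error:&error];
if (sampleBuffer != NULL) {
    [videoInput appendSampleBuffer:sampleBuffer];
    CFRelease(sampleBuffer); // our reference; the writer holds its own retain
}
```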
This gives me a buffer I can pass to the AVAssetWriter. So far, so good. The problem is what happens when I simply release the buffers after appending them to the writer.

If I do this:
[videoInput appendSampleBuffer:sampleBuffer];
CFRelease(sampleBuffer);
CFRelease(pixelBuffer);
then it works with no memory leak, but occasionally I see corrupted frames. Note the occasionally out-of-sync frames, and the obvious memory corruption at frames 391 and 394. It looks to me as though the buffer memory is being freed before AVFoundation has finished encoding it.

If I remove the CFRelease(pixelBuffer), the problem goes away. The resulting movie plays back perfectly smoothly, with no corruption at all. Of course, then I have a multi-gigabyte memory leak on my hands!
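For reference, step (a) above creates the pixel buffer from raw bytes roughly like this. This is a hedged sketch, not my exact code: the pixel format and the free() callback are assumptions, but the point is that the release callback is what makes the buffer genuinely own the bytes, so the memory lives exactly as long as the buffer's last retain:

```objc
// Hedged sketch of step (a): wrapping malloc'd raw bytes in a CVPixelBuffer.
// The release callback transfers ownership of the bytes to the buffer.
static void releaseRawBytes(void *releaseRefCon, const void *baseAddress) {
    free((void *)baseAddress); // assumption: the bytes were malloc'd
}

static CVPixelBufferRef pixelBufferFromRawBytes(void *bytes, size_t width,
                                                size_t height, size_t bytesPerRow) {
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn status = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                                   width, height,
                                                   kCVPixelFormatType_32BGRA, // assumption
                                                   bytes, bytesPerRow,
                                                   releaseRawBytes, NULL,
                                                   NULL, &pixelBuffer);
    return (status == kCVReturnSuccess) ? pixelBuffer : NULL;
}
```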
Has anyone else run into anything like this?

BTW: it makes no difference whether I use an AVAssetWriterInputPixelBufferAdaptor; I get the same result either way.

Here is a complete snippet that reproduces the problem:
- (void)recordMovieUsingStandardAVFTo:(NSURL *)url colorSequence:(NSArray *)colorSequence frameDuration:(CMTime)frameDuration size:(NSSize)size {
    NSError *error = nil;
    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:url fileType:AVFileTypeQuickTimeMovie error:&error];
    if ([self checkForError:error doing:@"creation of asset writer"] == NO) {
        return;
    }

    NSMutableDictionary *videoSettings = [NSMutableDictionary dictionary];
    [videoSettings setValue:[NSNumber numberWithLong:(long)size.width] forKey:AVVideoWidthKey];
    [videoSettings setValue:[NSNumber numberWithLong:(long)size.height] forKey:AVVideoHeightKey];
    [videoSettings setValue:AVVideoCodecH264 forKey:AVVideoCodecKey];

    AVAssetWriterInput *videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    [writer addInput:videoInput];
    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];

    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_async(queue, ^{
        NSError *localError = nil;
        CMTime frameTime = kCMTimeZero;
        int frameCounter = 0;
        for (int i = 0; i < 4; i++) {
            @autoreleasepool {
                for (NSColor *color in colorSequence) {
                    CMSampleTimingInfo timing = kCMTimingInfoInvalid;
                    timing.presentationTimeStamp = frameTime;
                    timing.duration = frameDuration;

                    // Wait until the input can accept another frame
                    while (videoInput.isReadyForMoreMediaData == NO) {
                        [NSThread sleepForTimeInterval:0.1];
                    }

                    CVPixelBufferRef pixelBuffer = [self createPixelBufferBufferOfColor:color size:size];
                    CMSampleBufferRef sampleBuffer = [OSGUtils wrapImageBufferInCMSampleBuffer:pixelBuffer timingInfo:&timing error:&localError];
                    BOOL recordingSuccess = [videoInput appendSampleBuffer:sampleBuffer];
                    if (recordingSuccess) {
                        frameTime = CMTimeAdd(frameTime, frameDuration);
                        frameCounter++;
                        if (frameCounter % 60 == 0) {
                            ApplicationLogInfo(@"Wrote frame at time %@", [NSString niceStringForCMTime:frameTime]);
                        }
                    } else {
                        ApplicationLogError(@"Can't write frame at time %@: %@", [NSString niceStringForCMTime:frameTime], writer.error);
                    }

                    // Releasing both buffers here is what triggers the corruption;
                    // dropping the CFRelease(pixelBuffer) avoids it, but leaks.
                    CFRelease(sampleBuffer);
                    CFRelease(pixelBuffer);
                }
            }
        }
        [videoInput markAsFinished];
        [writer endSessionAtSourceTime:frameTime];
        BOOL success = [writer finishWriting];
        if (!success) {
            ApplicationLogError(@"Failed to finish writing, %@, %ld", writer.error, (long)writer.status);
        } else {
            ApplicationLogInfo(@"Write complete");
        }
    });
}
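The helper createPixelBufferBufferOfColor:size: is referenced above but not shown. A hypothetical sketch of what it might look like, assuming CVPixelBufferCreate-allocated (buffer-owned) memory, a 32ARGB format, and an NSColor already in an RGB color space; my real implementation may differ:

```objc
// Hypothetical sketch: CVPixelBufferCreate allocates memory that the buffer
// itself owns, and the fill loop copies the pixel data into it, so releasing
// the buffer later cannot pull memory out from under the encoder.
- (CVPixelBufferRef)createPixelBufferBufferOfColor:(NSColor *)color size:(NSSize)size {
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          (size_t)size.width, (size_t)size.height,
                                          kCVPixelFormatType_32ARGB, // assumption
                                          NULL, &pixelBuffer);
    if (status != kCVReturnSuccess) {
        return NULL;
    }

    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    uint8_t *base = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);

    // Fill every pixel with the requested color (ARGB, 8 bits per channel).
    // Assumes `color` is in an RGB color space so the component accessors work.
    uint8_t a = (uint8_t)(color.alphaComponent * 255);
    uint8_t r = (uint8_t)(color.redComponent * 255);
    uint8_t g = (uint8_t)(color.greenComponent * 255);
    uint8_t b = (uint8_t)(color.blueComponent * 255);
    for (size_t row = 0; row < (size_t)size.height; row++) {
        uint8_t *pixel = base + row * bytesPerRow;
        for (size_t col = 0; col < (size_t)size.width; col++) {
            pixel[0] = a; pixel[1] = r; pixel[2] = g; pixel[3] = b;
            pixel += 4;
        }
    }
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    return pixelBuffer; // +1, the caller releases it
}
```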