
I am getting a UIImage from a CMSampleBufferRef video buffer every N video frames, for example:

- (void)imageFromVideoBuffer:(void(^)(UIImage* image))completion {
    CMSampleBufferRef sampleBuffer = _myLastSampleBuffer;
    if (sampleBuffer != nil) {
        CFRetain(sampleBuffer);   // keep the buffer alive while we render from it
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
        _lastAppendedVideoBuffer.sampleBuffer = nil;   // drop our reference to the stored buffer
        if (_context == nil) {
            _context = [CIContext contextWithOptions:nil];   // reuse one CIContext; creating one per frame is expensive
        }
        CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CGImageRef cgImage = [_context createCGImage:ciImage fromRect:
                              CGRectMake(0, 0, CVPixelBufferGetWidth(buffer), CVPixelBufferGetHeight(buffer))];
        UIImage *image = [UIImage imageWithCGImage:cgImage];

        CGImageRelease(cgImage);    // balances createCGImage:
        CFRelease(sampleBuffer);    // balances the CFRetain above

        if (completion) completion(image);

        return;
    }
    if (completion) completion(nil);
}

Xcode and Instruments detect a memory leak, but I can't get rid of it. I am releasing the CGImageRef and the CMSampleBufferRef as usual:

CGImageRelease(cgImage);
CFRelease(sampleBuffer);

[UPDATE] Here is the AVCapture output callback where I get the sampleBuffer:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (captureOutput == _videoOutput) {
        _lastVideoBuffer.sampleBuffer = sampleBuffer;
        id<CIImageRenderer> imageRenderer = _CIImageRenderer;

        dispatch_async(dispatch_get_main_queue(), ^{
            @autoreleasepool {
                CIImage *ciImage = nil;
                ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
                if (_context == nil) {
                    _context = [CIContext contextWithOptions:nil];
                }
                CGImageRef processedCGImage = [_context createCGImage:ciImage
                                                             fromRect:[ciImage extent]];
                //UIImage *image = [UIImage imageWithCGImage:processedCGImage];
                CGImageRelease(processedCGImage);
                NSLog(@"Captured image %@", ciImage);
            }
        });
    }
}

The code that leaks is the createCGImage:ciImage call:

CGImageRef processedCGImage = [_context createCGImage:ciImage
                                             fromRect:[ciImage extent]];

even with an @autoreleasepool, a CGImageRelease of the CGImage reference, and the CIContext held as an instance property.
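If you only need the processed pixels rather than a UIImage, one way to sidestep createCGImage: entirely is to render straight back into the pixel buffer with CIContext's render:toCVPixelBuffer: (the same API the second answer below ends up using). A minimal sketch, assuming _context is the instance-property CIContext from the code above:

// Sketch: avoid the createCGImage: round trip by rendering the CIImage
// back into the CVPixelBuffer it came from. No CGImage is ever created,
// so this only helps when a UIImage isn't actually required downstream.
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
if (_context == nil) {
    _context = [CIContext contextWithOptions:nil];
}
[_context render:ciImage toCVPixelBuffer:pixelBuffer];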

This seems to be the same problem addressed here: Can't save CIImage to file on iOS without memory leaks

[UPDATE] The leak seems to be due to a bug. The issue is well described in Memory leak on CIContext createCGImage at iOS 9?

A sample project shows how to reproduce this leak: http://www.osamu.co.jp/DataArea/VideoCameraTest.zip

The last comments there assure:

It looks like they fixed it in 9.1b3. If anyone needs a workaround that works on iOS 9.0.x, I was able to get it working with this:

In the test code (Objective-C in this case):

[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    if (error) return;

    NSString *filePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"ipdf_pic_%i.jpeg", (int)[NSDate date].timeIntervalSince1970]];

    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    dispatch_async(dispatch_get_main_queue(), ^{
        @autoreleasepool {
            CIImage *enhancedImage = [CIImage imageWithData:imageData];

            if (!enhancedImage) return;

            static CIContext *ctx = nil;
            if (!ctx) ctx = [CIContext contextWithOptions:nil];

            CGImageRef imageRef = [ctx createCGImage:enhancedImage fromRect:enhancedImage.extent format:kCIFormatBGRA8 colorSpace:nil];

            UIImage *image = [UIImage imageWithCGImage:imageRef scale:1.0 orientation:UIImageOrientationRight];

            [[NSFileManager defaultManager] createFileAtPath:filePath contents:UIImageJPEGRepresentation(image, 0.8) attributes:nil];

            CGImageRelease(imageRef);
        }
    });
}];

The workaround for iOS 9.0 should be:

extension CIContext {
    // Renders the CIImage into a manually allocated RGBA bitmap and wraps it
    // in a CGImage, bypassing the leaky createCGImage(_:fromRect:) path.
    func createCGImage_(image: CIImage, fromRect: CGRect) -> CGImage {
        let width = Int(fromRect.width)
        let height = Int(fromRect.height)

        let rawData = UnsafeMutablePointer<UInt8>.alloc(width * height * 4)
        render(image, toBitmap: rawData, rowBytes: width * 4, bounds: fromRect, format: kCIFormatRGBA8, colorSpace: CGColorSpaceCreateDeviceRGB())
        // The data provider's release callback frees the bitmap once the CGImage is done with it.
        let dataProvider = CGDataProviderCreateWithData(nil, rawData, height * width * 4) { info, data, size in UnsafeMutablePointer<UInt8>(data).dealloc(size) }
        return CGImageCreate(width, height, 8, 32, width * 4, CGColorSpaceCreateDeviceRGB(), CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedLast.rawValue), dataProvider, nil, false, .RenderingIntentDefault)!
    }
}
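For anyone staying in Objective-C, the same render-to-bitmap idea can be transliterated roughly like this (a sketch, not tested against the bug; the method name is mine and _context is the instance-property CIContext from the question):

// Sketch of the same workaround in Objective-C. The data provider's
// release callback frees the bitmap once the CGImage is done with it;
// the caller owns the returned CGImage and must CGImageRelease it.
static void releaseBitmapData(void *info, const void *data, size_t size) {
    free((void *)data);
}

- (CGImageRef)createCGImageWorkaround:(CIImage *)image fromRect:(CGRect)rect {
    size_t width  = (size_t)CGRectGetWidth(rect);
    size_t height = (size_t)CGRectGetHeight(rect);
    size_t bytesPerRow = width * 4;

    void *rawData = malloc(height * bytesPerRow);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    [_context render:image
            toBitmap:rawData
            rowBytes:bytesPerRow
              bounds:rect
              format:kCIFormatRGBA8
          colorSpace:colorSpace];

    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, rawData, height * bytesPerRow, releaseBitmapData);
    CGImageRef cgImage = CGImageCreate(width, height, 8, 32, bytesPerRow, colorSpace,
                                       (CGBitmapInfo)kCGImageAlphaPremultipliedLast,
                                       provider, NULL, false, kCGRenderingIntentDefault);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);
    return cgImage;   // caller must CGImageRelease()
}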

2 Answers


We were seeing a similar issue in an app we created, where we process each frame with OpenCV for feature keypoints and send a frame off every couple of seconds. After running for a while we would end up with quite a few memory pressure messages.

We managed to correct this by running our processing code in its own autorelease pool (jpegDataFromSampleBufferAndCrop: does something similar to what you are doing, with the addition of cropping; a sketch of that helper follows the method below):

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    @autoreleasepool {

        if ([self.lastFrameSentAt timeIntervalSinceNow] < -kContinuousRateInSeconds) {

            NSData *imageData = [self jpegDataFromSampleBufferAndCrop:sampleBuffer];

            if (imageData) {
                [self processImageData:imageData];
            }

            self.lastFrameSentAt = [NSDate date];

            imageData = nil;
        }
    }
}
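The answer does not include jpegDataFromSampleBufferAndCrop:, but a hypothetical sketch of what such a helper might look like (the crop rect and compression quality here are my own assumptions) is:

// A hypothetical sketch of jpegDataFromSampleBufferAndCrop:. It converts
// the frame to a CIImage, crops a centered square, renders it once through
// a reused CIContext and returns JPEG data.
- (NSData *)jpegDataFromSampleBufferAndCrop:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) return nil;

    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    // Assumed crop: the centered square of the frame.
    CGRect extent = [ciImage extent];
    CGFloat side = MIN(extent.size.width, extent.size.height);
    CGRect cropRect = CGRectMake(CGRectGetMidX(extent) - side / 2,
                                 CGRectGetMidY(extent) - side / 2,
                                 side, side);
    CIImage *cropped = [ciImage imageByCroppingToRect:cropRect];

    static CIContext *ctx = nil;
    if (!ctx) ctx = [CIContext contextWithOptions:nil];

    CGImageRef cgImage = [ctx createCGImage:cropped fromRect:cropRect];
    if (cgImage == NULL) return nil;

    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return UIImageJPEGRepresentation(image, 0.8);
}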
Answered 2015-09-21T02:08:52.347

I can confirm that this memory leak still exists on iOS 9.2. (I have also posted on the Apple Developer Forum.)

I get the same memory leak on iOS 9.2. I have tested dropping the EAGLContext by using MetalKit and MTLDevice. I have tested different CIContext methods such as drawImage, createCGImage and render, but nothing seems to work.
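For reference, creating the Metal-backed CIContext this alludes to would look something like the following sketch (iOS 9+; per the answer, this variant did not avoid the leak either):

// Sketch: a CIContext backed by a Metal device instead of an EAGLContext.
#import <Metal/Metal.h>
#import <CoreImage/CoreImage.h>

id<MTLDevice> device = MTLCreateSystemDefaultDevice();
CIContext *metalContext = [CIContext contextWithMTLDevice:device];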

Clearly this is an iOS 9 bug. Try it yourself by downloading the example app from Apple (see below), running the same project on a device with iOS 8.4 and then on a device with iOS 9.2, and watching the memory gauge in Xcode.

Download https://developer.apple.com/library/ios/samplecode/AVBasicVideoOutput/Introduction/Intro.html#//apple_ref/doc/uid/DTS40013109

Add this to APLEAGLView.h:20:

@property (strong, nonatomic) CIContext* ciContext;

Replace APLEAGLView.m:118 with this:

[EAGLContext setCurrentContext:_context];
_ciContext = [CIContext contextWithEAGLContext:_context];

And finally replace APLEAGLView.m:341-343 with this:

glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

@autoreleasepool
{
    CIImage* sourceImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIFilter* filter = [CIFilter filterWithName:@"CIGaussianBlur" keysAndValues:kCIInputImageKey, sourceImage, nil];
    CIImage* filteredImage = filter.outputImage;

    [_ciContext render:filteredImage toCVPixelBuffer:pixelBuffer];
}

glBindRenderbuffer(GL_RENDERBUFFER, _colorBufferHandle);
Answered 2015-12-26T11:27:13.143