Title: ios - Memory leak in CMSampleBufferGetImageBuffer

Author: 菜鸟教程小白    Time: 2022-12-13 11:42
Subject: ios - Memory leak in CMSampleBufferGetImageBuffer

Every N video frames I grab a UIImage from the CMSampleBufferRef video buffer, like this:

- (void)imageFromVideoBuffer:(void (^)(UIImage *image))completion {
    CMSampleBufferRef sampleBuffer = _myLastSampleBuffer;
    if (sampleBuffer != nil) {
        CFRetain(sampleBuffer);
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
        _lastAppendedVideoBuffer.sampleBuffer = nil;
        if (_context == nil) {
            _context = [CIContext contextWithOptions:nil];
        }
        CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CGImageRef cgImage = [_context createCGImage:ciImage fromRect:
                              CGRectMake(0, 0, CVPixelBufferGetWidth(buffer), CVPixelBufferGetHeight(buffer))];
        __block UIImage *image = [UIImage imageWithCGImage:cgImage];

        CGImageRelease(cgImage);
        CFRelease(sampleBuffer);

        if(completion) completion(image);

        return;
    }
    if(completion) completion(nil);
}
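
For reference, the Best Answer at the bottom of this thread resolves a very similar situation by draining Core Image's autoreleased intermediates on every call. A minimal sketch (not from the post) of the same method wrapped in its own @autoreleasepool, assuming the same _myLastSampleBuffer and _context ivars as above:

- (void)imageFromVideoBuffer:(void (^)(UIImage *image))completion {
    @autoreleasepool {
        CMSampleBufferRef sampleBuffer = _myLastSampleBuffer; // assumed ivar, as above
        if (sampleBuffer == NULL) {
            if (completion) completion(nil);
            return;
        }
        CFRetain(sampleBuffer);
        CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
        if (_context == nil) {
            _context = [CIContext contextWithOptions:nil];
        }
        CGImageRef cgImage = [_context createCGImage:ciImage
                                            fromRect:CGRectMake(0, 0,
                                                                CVPixelBufferGetWidth(buffer),
                                                                CVPixelBufferGetHeight(buffer))];
        UIImage *image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
        CFRelease(sampleBuffer);
        if (completion) completion(image);
    } // pool pops here, releasing Core Image's autoreleased intermediates
}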

Xcode and Instruments detect a memory leak, but I cannot get rid of it. I release the CGImageRef and the CMSampleBufferRef as usual:

CGImageRelease(cgImage);
CFRelease(sampleBuffer);
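
Worth noting: the CFRetain/CFRelease pair above only balances correctly if the stored buffer was retained when it was stashed, since a CMSampleBufferRef handed to the capture delegate is only guaranteed valid for the duration of the callback. A sketch of what a correct setter for the stored buffer could look like (this setter is not in the original post):

// Hypothetical setter for the _myLastSampleBuffer ivar used above; not from the post.
// CFRetain the incoming buffer before CFRelease-ing the old one so the stored
// reference stays valid between delegate callbacks.
- (void)setMyLastSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    if (sampleBuffer != NULL) {
        CFRetain(sampleBuffer);
    }
    if (_myLastSampleBuffer != NULL) {
        CFRelease(_myLastSampleBuffer);
    }
    _myLastSampleBuffer = sampleBuffer;
}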

[UPDATE] This is what I put in the AVCapture output callback to get the sampleBuffer:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (captureOutput == _videoOutput) {
        _lastVideoBuffer.sampleBuffer = sampleBuffer;
        id<CIImageRenderer> imageRenderer = _CIImageRenderer;

        dispatch_async(dispatch_get_main_queue(), ^{
            @autoreleasepool {
                CIImage *ciImage = nil;
                ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
                if(_context==nil) {
                    _context = [CIContext contextWithOptions:nil];
                }
                CGImageRef processedCGImage = [_context createCGImage:ciImage
                                                             fromRect:[ciImage extent]];
                //UIImage *image=[UIImage imageWithCGImage:processedCGImage];
                CGImageRelease(processedCGImage);
                NSLog(@"Captured image %@", ciImage);
            }
        });
    }
}

The leak is in the createCGImage:fromRect: call:

CGImageRef processedCGImage = [_context createCGImage:ciImage
                                             fromRect:[ciImage extent]];

This happens even with an @autoreleasepool in place, a CGImageRelease of the CGImage reference, and the CIContext held as an instance property.

This looks like the same problem addressed here: Can't save CIImage to file on iOS without memory leaks

[UPDATE] The leak seems to be due to a bug. The issue is well described in Memory leak on CIContext createCGImage at iOS 9?

A sample project shows how to reproduce this leak: http://www.osamu.co.jp/DataArea/VideoCameraTest.zip

The last comments there assure that

It looks like they fixed this in 9.1b3. If anyone needs a workaround that works on iOS 9.0.x, I was able to get it working with this:

In the test code from that question:

[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)  
    {  
        if (error) return;  

        __block NSString *filePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"ipdf_pic_%i.jpeg", (int)[NSDate date].timeIntervalSince1970]];

        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];  
        dispatch_async(dispatch_get_main_queue(), ^  
        {  

            @autoreleasepool  
            {  
                CIImage *enhancedImage = [CIImage imageWithData:imageData];  

                if (!enhancedImage) return;  

                static CIContext *ctx = nil; if (!ctx) ctx = [CIContext contextWithOptions:nil];  

                CGImageRef imageRef = [ctx createCGImage:enhancedImage fromRect:enhancedImage.extent format:kCIFormatBGRA8 colorSpace:nil];  

                UIImage *image = [UIImage imageWithCGImage:imageRef scale:1.0 orientation:UIImageOrientationRight];  

                [[NSFileManager defaultManager] createFileAtPath:filePath contents:UIImageJPEGRepresentation(image, 0.8) attributes:nil];  

                CGImageRelease(imageRef);  
            }  
        });  
    }]; 

The workaround for iOS 9.0 should be:

extension CIContext {  
    func createCGImage_(image:CIImage, fromRect:CGRect) -> CGImage {  
        let width = Int(fromRect.width)  
        let height = Int(fromRect.height)  

        let rawData =  UnsafeMutablePointer<UInt8>.alloc(width * height * 4)  
        render(image, toBitmap: rawData, rowBytes: width * 4, bounds: fromRect, format: kCIFormatRGBA8, colorSpace: CGColorSpaceCreateDeviceRGB())  
        let dataProvider = CGDataProviderCreateWithData(nil, rawData, height * width * 4) {info, data, size in UnsafeMutablePointer<UInt8>(data).dealloc(size)}  
        return CGImageCreate(width, height, 8, 32, width * 4, CGColorSpaceCreateDeviceRGB(), CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedLast.rawValue), dataProvider, nil, false, .RenderingIntentDefault)!  
    }  
}  
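
For Objective-C call sites, the same render-to-bitmap workaround can be sketched with CIContext's render:toBitmap:rowBytes:bounds:format:colorSpace:, bypassing createCGImage:fromRect: entirely. This is an untested sketch (the function name and the RGBA8 layout are our assumptions), mirroring the Swift extension above:

#import <CoreImage/CoreImage.h>

// Frees the bitmap allocated in CreateCGImageWorkaround once the CGImage is done with it.
static void ReleaseBitmapData(void *info, const void *data, size_t size) {
    free((void *)data);
}

// Renders the CIImage into a malloc'ed RGBA8 bitmap and wraps it in a CGImage.
// The caller is responsible for CGImageRelease() on the result.
static CGImageRef CreateCGImageWorkaround(CIContext *context, CIImage *image, CGRect fromRect) {
    size_t width  = (size_t)CGRectGetWidth(fromRect);
    size_t height = (size_t)CGRectGetHeight(fromRect);
    size_t bytesPerRow = width * 4;

    void *rawData = malloc(height * bytesPerRow);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    [context render:image
           toBitmap:rawData
           rowBytes:bytesPerRow
             bounds:fromRect
             format:kCIFormatRGBA8
         colorSpace:colorSpace];

    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, rawData,
                                                              height * bytesPerRow,
                                                              ReleaseBitmapData);
    CGImageRef cgImage = CGImageCreate(width, height, 8, 32, bytesPerRow, colorSpace,
                                       (CGBitmapInfo)kCGImageAlphaPremultipliedLast,
                                       provider, NULL, false, kCGRenderingIntentDefault);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);
    return cgImage;
}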



Best Answer


We ran into a similar issue in an app we built, where we process every frame for feature keypoints with OpenCV and send a frame off every couple of seconds. After running for a while we would end up with quite a few memory-pressure messages.

We managed to correct this by running our processing code in its own autorelease pool (jpegDataFromSampleBufferAndCrop: does something similar to what you are doing, with the addition of cropping):

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    @autoreleasepool {

        if ([self.lastFrameSentAt timeIntervalSinceNow] < -kContinuousRateInSeconds) {

            NSData *imageData = [self jpegDataFromSampleBufferAndCrop:sampleBuffer];

            if (imageData) {
                [self processImageData:imageData];
            }

            self.lastFrameSentAt = [NSDate date];

            imageData = nil;
        }
    }
}
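
The answer does not show jpegDataFromSampleBufferAndCrop: itself, so the following is only a hypothetical reconstruction based on the description above ("similar to what you are doing, with the addition of cropping"); the crop inset, the JPEG quality, and the _context ivar are all assumptions:

// Hypothetical helper, not from the answer: CIImage -> cropped CGImage -> JPEG data.
// Runs under the caller's @autoreleasepool, so autoreleased intermediates are
// drained once per frame instead of piling up on the capture queue.
- (NSData *)jpegDataFromSampleBufferAndCrop:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) return nil;

    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CGRect cropRect = CGRectInset(ciImage.extent, 16.0, 16.0); // assumed crop
    CIImage *cropped = [ciImage imageByCroppingToRect:cropRect];

    if (_context == nil) {
        _context = [CIContext contextWithOptions:nil];
    }
    CGImageRef cgImage = [_context createCGImage:cropped fromRect:cropped.extent];
    if (cgImage == NULL) return nil;

    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return UIImageJPEGRepresentation(image, 0.8); // assumed compression quality
}

The key point is less the helper itself than the enclosing @autoreleasepool in the callback: without it, autoreleased CIImage/UIImage objects created on the capture queue only get released whenever that queue's outer pool drains, which looks exactly like a leak under Instruments.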

As for ios - Memory leak in CMSampleBufferGetImageBuffer, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/32685756/





