
Task: record video in real time while applying a filter

Problem: getting a CVPixelBuffer from the modified UIImage is too slow

My camera output is filtered and fed straight into a UIImageView so the user can see the effect live, even when no video is being recorded and no photo is being taken. I'd like some way to record this constantly changing UIImage to video that doesn't work the way my current code does. Right now I record by appending a CVPixelBuffer to an asset writer, but because the filter is applied to a UIImage, I first have to convert the UIImage back into a buffer. I've tested the pipeline with and without the UIImage-to-buffer step, so I've confirmed that this conversion is what causes the unacceptable slowdown.
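What I imagine the recording path could look like instead, as a rough sketch rather than working code: render the filtered CIImage straight into a pooled CVPixelBuffer with a shared CIContext, skipping the UIImage redraw entirely. Here appendFiltered is a hypothetical helper; assetWriterPixelBufferInput is the same adaptor property used in my code below, and its pixelBufferPool is only available once writing has started:

    // hypothetical helper - render the filtered CIImage directly into a
    // pooled CVPixelBuffer, avoiding the UIImage -> CGContext redraw
    let ciContext = CIContext() // created once; making a new CIContext per frame is expensive

    func appendFiltered(_ ciImage: CIImage, at time: CMTime) {
        // pixelBufferPool is non-nil only after the asset writer has started writing
        guard let adaptor = assetWriterPixelBufferInput,
            let pool = adaptor.pixelBufferPool else { return }

        var pixelBuffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &pixelBuffer)
        guard let buffer = pixelBuffer else { return }

        // draw the CIImage into the buffer directly, no UIImage involved
        ciContext.render(ciImage, to: buffer)
        adaptor.append(buffer, withPresentationTime: time)
    }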

Below is the code from captureOutput, commented to make clear what's happening, along with the method that gets a buffer from the UIImage:

// called for every frame of the device's camera output, in real time
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {

        if captureOutput {

            // create a CIImage from the sample buffer
            let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
            let cameraImage = CIImage(cvPixelBuffer: pixelBuffer!)

            // wrap the CIImage in a UIImage
            image = UIImage(ciImage: cameraImage)

            if let ciImage = image?.ciImage {

                // apply the filter to the CIImage
                image = filterCIImage(with: ciImage)

                // make a CGImage and apply the orientation
                image = UIImage(cgImage: (image?.cgImage)!, scale: 1.0, orientation: UIImageOrientation.right)

                // get format description, dimensions and current sample time
                let formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer)!
                self.currentVideoDimensions = CMVideoFormatDescriptionGetDimensions(formatDescription)
                self.currentSampleTime = CMSampleBufferGetOutputPresentationTimeStamp(sampleBuffer)

                // check whether the user toggled video recording
                // and the asset writer is ready
                if videoIsRecording && self.assetWriterPixelBufferInput?.assetWriterInput.isReadyForMoreMediaData == true {
                    // get a pixel buffer from the UIImage - SLOW!
                    let filteredBuffer = buffer(from: image!)

                    // append the buffer to the asset writer
                    let success = self.assetWriterPixelBufferInput?.append(filteredBuffer!, withPresentationTime: self.currentSampleTime!)

                    if success == false {
                        print("Pixel Buffer failed")
                    }
                }
            }

            DispatchQueue.main.async {
                // update the UIImageView with the filtered camera output
                imageView!.image = image
            }
        }
    }
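My asset writer setup isn't shown above; for context, here is a minimal sketch of the kind of AVAssetWriterInputPixelBufferAdaptor setup the properties in captureOutput assume. The outputURL and the dimensions are placeholders, not my actual values:

    // minimal sketch of the writer/adaptor setup assumed by the code above
    let assetWriter = try AVAssetWriter(outputURL: outputURL, fileType: .mov)

    let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: 1080,
        AVVideoHeightKey: 1920
    ])
    // required for a live capture source, so the writer doesn't stall the pipeline
    writerInput.expectsMediaDataInRealTime = true

    // the adaptor owns a CVPixelBufferPool once writing has started
    let assetWriterPixelBufferInput = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: writerInput,
        sourcePixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ])

    assetWriter.add(writerInput)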


// UIImage to buffer method: allocates a fresh CVPixelBuffer and redraws the
    // image into a CGContext on the CPU for every frame - this is the slow part
    func buffer(from image: UIImage) -> CVPixelBuffer? {
        let attrs = [kCVPixelBufferCGImageCompatibilityKey: kCFBooleanTrue,
                     kCVPixelBufferCGBitmapContextCompatibilityKey: kCFBooleanTrue] as CFDictionary
        var pixelBuffer: CVPixelBuffer?
        let status = CVPixelBufferCreate(kCFAllocatorDefault, Int(image.size.width), Int(image.size.height), kCVPixelFormatType_32ARGB, attrs, &pixelBuffer)
        guard status == kCVReturnSuccess else {
            return nil
        }

        CVPixelBufferLockBaseAddress(pixelBuffer!, CVPixelBufferLockFlags(rawValue: 0))
        let pixelData = CVPixelBufferGetBaseAddress(pixelBuffer!)

        let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
        let context = CGContext(data: pixelData, width: Int(image.size.width), height: Int(image.size.height), bitsPerComponent: 8, bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer!), space: rgbColorSpace, bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)

        // flip the coordinate system so the image isn't drawn upside down
        context?.translateBy(x: 0, y: image.size.height)
        context?.scaleBy(x: 1.0, y: -1.0)

        UIGraphicsPushContext(context!)
        image.draw(in: CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height))
        UIGraphicsPopContext()
        CVPixelBufferUnlockBaseAddress(pixelBuffer!, CVPixelBufferLockFlags(rawValue: 0))

        return pixelBuffer
    }
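Even if I keep this UIImage path, I assume part of the per-frame cost is the CVPixelBufferCreate call; buffers could presumably be reused from the adaptor's pool instead. A sketch of that variant (pooledBuffer is a hypothetical helper, and I suspect the CPU redraw in draw(in:) would still dominate):

    // hypothetical variant: reuse buffers from the adaptor's pool instead of
    // allocating a new one per frame; this removes only the allocation cost
    func pooledBuffer() -> CVPixelBuffer? {
        guard let pool = assetWriterPixelBufferInput?.pixelBufferPool else { return nil }
        var pixelBuffer: CVPixelBuffer?
        let status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &pixelBuffer)
        return status == kCVReturnSuccess ? pixelBuffer : nil
    }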