
I am provided with a CVPixelBuffer, which I need to attach to the rtmpStream object from the lf.swift library in order to stream it to YouTube. The append call looks like this: rtmpStream.appendSampleBuffer(sampleBuffer: CMSampleBuffer, withType: CMSampleBufferType)

So I need to somehow convert the CVPixelBuffer into a CMSampleBuffer to append to rtmpStream. This is what I tried:

    var sampleBuffer: CMSampleBuffer? = nil

    // Start from invalid timing info and set only the presentation time;
    // duration and decodeTimeStamp are left as kCMTimeInvalid.
    var sampleTimingInfo: CMSampleTimingInfo = kCMTimingInfoInvalid
    sampleTimingInfo.presentationTimeStamp = presentationTime

    // Build a video format description that matches the pixel buffer.
    var formatDesc: CMVideoFormatDescription? = nil
    _ = CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &formatDesc)

    if let formatDesc = formatDesc {
        CMSampleBufferCreateReadyWithImageBuffer(kCFAllocatorDefault, pixelBuffer, formatDesc, &sampleTimingInfo, &sampleBuffer)
    }

    if let sampleBuffer = sampleBuffer {
        self.rtmpStream.appendSampleBuffer(sampleBuffer, withType: CMSampleBufferType.video)
    }

but unfortunately this doesn't work. The streaming library is tested and works fine when I stream camera input or screen capture. I think the problem may be sampleTimingInfo, because it requires a decode time and a duration, which I have no idea how to obtain for the provided CVPixelBuffer.
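For what it's worth, for a raw image buffer the missing timing fields can legitimately stay invalid: only presentationTimeStamp needs a real value, while duration and decodeTimeStamp may be kCMTimeInvalid. Below is a sketch of the conversion as a standalone helper (the function name makeSampleBuffer is mine, and the kCMSampleAttachmentKey_DisplayImmediately attachment is an assumption; some consumers reportedly need it before they will render a raw image sample buffer):

```swift
import CoreMedia
import CoreVideo

// Sketch: wrap a CVPixelBuffer in a CMSampleBuffer (Swift 3-era CoreMedia API,
// matching the question's code). Returns nil if any CoreMedia call fails.
func makeSampleBuffer(from pixelBuffer: CVPixelBuffer,
                      presentationTime: CMTime) -> CMSampleBuffer? {
    // Format description derived from the pixel buffer itself.
    var formatDesc: CMVideoFormatDescription? = nil
    let status = CMVideoFormatDescriptionCreateForImageBuffer(
        kCFAllocatorDefault, pixelBuffer, &formatDesc)
    guard status == noErr, let desc = formatDesc else { return nil }

    // Only the presentation time needs a real value for a raw frame;
    // duration and decodeTimeStamp can be kCMTimeInvalid.
    var timing = CMSampleTimingInfo(
        duration: kCMTimeInvalid,
        presentationTimeStamp: presentationTime,
        decodeTimeStamp: kCMTimeInvalid)

    var sampleBuffer: CMSampleBuffer? = nil
    CMSampleBufferCreateReadyWithImageBuffer(
        kCFAllocatorDefault, pixelBuffer, desc, &timing, &sampleBuffer)

    // Assumption: mark the frame for immediate display, which some
    // pipelines require before consuming an uncompressed sample buffer.
    if let sb = sampleBuffer,
       let attachments = CMSampleBufferGetSampleAttachmentsArray(sb, true),
       CFArrayGetCount(attachments) > 0 {
        let dict = unsafeBitCast(CFArrayGetValueAtIndex(attachments, 0),
                                 to: CFMutableDictionary.self)
        CFDictionarySetValue(
            dict,
            Unmanaged.passUnretained(kCMSampleAttachmentKey_DisplayImmediately).toOpaque(),
            Unmanaged.passUnretained(kCFBooleanTrue).toOpaque())
    }
    return sampleBuffer
}
```

With a helper like this the call site would collapse to something like: if let sb = makeSampleBuffer(from: pixelBuffer, presentationTime: presentationTime) { rtmpStream.appendSampleBuffer(sb, withType: .video) }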
