Does anyone know whether it is possible to pass the frame buffers captured in a broadcast upload extension up to the host application, or should I upload them directly to the backend? My goal is to intercept frame buffers from ReplayKit, send them to my app, and broadcast the video from my app using WebRTC. Any help would be appreciated. Thanks in advance.

3 Answers


Only the broadcast upload extension and the broadcast UI extension are loaded when a broadcast starts. As far as I know, there is no programmatic way to launch your host application and stream any data to it in the background.

However, you can implement the entire logic inside the broadcast upload extension. Your RPBroadcastSampleHandler implementation is fed video CMSampleBuffers. All post-processing and upload logic is up to your implementation, so you can decompress and process the frames and then upload them to your server in whatever way suits you. If you need any configuration or authorization details, you can simply set them in the broadcast UI extension, or even in your host application, and store them in shared storage.
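As an illustration of the shared-storage idea, UserDefaults backed by an App Group is a common choice. This is a minimal sketch, not from the original answer; the suite name and key are hypothetical placeholders, and both targets must declare the App Group in their entitlements:

```swift
import Foundation

// In the host app (or broadcast UI extension): store the auth details.
// "group.com.example.broadcast" is a hypothetical App Group identifier.
let shared = UserDefaults(suiteName: "group.com.example.broadcast")
shared?.set("my-auth-token", forKey: "uploadAuthToken")

// In the broadcast upload extension: read them back before uploading.
if let token = UserDefaults(suiteName: "group.com.example.broadcast")?
        .string(forKey: "uploadAuthToken") {
    // Attach the token to your upload requests.
    print("Using auth token: \(token)")
}
```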

There is not much information about this on the internet or in Apple's documentation, but it can still be done.

Answered 2016-11-18T11:04:32.453

I tried exactly the same thing with a ReplayKit and WebRTC combination. The fundamental issue with WebRTC on iOS is that it cannot handle a video stream once the app goes to the background. So you can stream your own app's screen over WebRTC while your video chat app is in the foreground, but to capture another app your app has to go to the background, and at that point you can only keep voice, not video, flowing over WebRTC.

You'd be better off uploading to the server directly from the upload extension. I have already wasted too much time trying to connect the upload extension to the host app; there is essentially no control over the upload extension.
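A sketch of what uploading straight from the extension might look like, assuming a hypothetical ingest endpoint and that you have already encoded a frame (or a chunk of frames) into Data. Extensions are memory-constrained, so in practice you would compress with a hardware encoder rather than send raw frames:

```swift
import Foundation

// Hypothetical endpoint and token; real code would batch and encode frames first.
func upload(encodedChunk: Data, to endpoint: URL, token: String) {
    var request = URLRequest(url: endpoint)
    request.httpMethod = "POST"
    request.setValue("Bearer \(token)", forHTTPHeaderField: "Authorization")
    request.setValue("application/octet-stream", forHTTPHeaderField: "Content-Type")
    URLSession.shared.uploadTask(with: request, from: encodedChunk) { _, _, error in
        if let error = error {
            print("Upload failed: \(error)")
        }
    }.resume()
}
```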

Answered 2018-06-08T17:24:06.123

I have some code for you. I implemented this in my project and discussed it on google-groups: https://groups.google.com/d/msg/discuss-webrtc/jAHCnB12khE/zJEu1vyUAgAJ

I will reproduce the code here for future readers.

First, I created an extra class in the broadcast extension to manage the WebRTC-related code and called it PeerManager.

Set up the video track with the local stream. Note that you should do this before generating the local offer.

    private func setupVideoStreaming() {
        localStream = webRTCPeer.peerConnectionFactory.mediaStream(withStreamId: "\(personID)_screen_sharing")
        videoSource = webRTCPeer.peerConnectionFactory.videoSource()
        videoCapturer = RTCVideoCapturer(delegate: videoSource)
        // Downscale the captured frames to limit bandwidth.
        videoSource.adaptOutputFormat(toWidth: 441, height: 736, fps: 15)
        let videoTrack = webRTCPeer.peerConnectionFactory.videoTrack(with: videoSource, trackId: "screen_share_track_id")
        videoTrack.isEnabled = true
        localStream.addVideoTrack(videoTrack)
        // Replace any previously attached local streams with the new one.
        for localStream in webRTCPeer.localPeerConnection.peerConnection.localStreams {
            webRTCPeer.localPeerConnection.peerConnection.remove(localStream)
        }
        webRTCPeer.localPeerConnection.peerConnection.add(localStream)
    }

I get a callback from the system that hands me a CMSampleBuffer; I convert it into an RTCVideoFrame and feed it to the videoSource (which emulates a VideoCapturer).

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case RPSampleBufferType.video:
            // Handle the video sample buffer
            guard peerManager != nil, let imageBuffer: CVImageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
                break
            }
            let pixelFormat = CVPixelBufferGetPixelFormatType(imageBuffer) // kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
            let timeStampNs: Int64 = Int64(CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1_000_000_000)
            let rtcPixelBuffer = RTCCVPixelBuffer(pixelBuffer: imageBuffer)
            let rtcVideoFrame = RTCVideoFrame(buffer: rtcPixelBuffer, rotation: ._0, timeStampNs: timeStampNs)
            peerManager.push(videoFrame: rtcVideoFrame)
        case RPSampleBufferType.audioApp:
            break
        case RPSampleBufferType.audioMic:
            break
        }
    }

Here is the code from the peerManager that implements the push function used above. Nothing unusual here: we use the delegate to emulate the capturer's behavior.

    func push(videoFrame: RTCVideoFrame) {
        guard isConnected, videoCapturer != nil, isProcessed else {
            return
        }
        videoSource.capturer(videoCapturer, didCapture: videoFrame)
    }

Now you are ready to generate the local offer, send it, and transfer any data you want. Try inspecting your local offer: if you did everything correctly, you should see a=sendonly in the offer's SDP.
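A quick way to verify this is to scan the generated offer's SDP string for that attribute; a small helper, with names of my own choosing rather than from the original answer:

```swift
// Returns true if the SDP marks the media direction as send-only.
func offerIsSendOnly(_ sdp: String) -> Bool {
    return sdp.contains("a=sendonly")
}

// Usage inside the offer completion handler, for example:
// peerConnection.offer(for: constraints) { sdp, error in
//     if let sdp = sdp, !offerIsSendOnly(sdp.sdp) {
//         print("Warning: local offer is not send-only")
//     }
// }
```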

P.S. As VladimirTechMan suggested, you can also look at the sample broadcast-extension code in the AppRTCMobile demo app. I found the link for you; it is an Objective-C example: https://webrtc.googlesource.com/src/+/358f2e076051d28b012529d3ae6a080838d27209 You should look at the ARDBroadcastSampleHandler.m/.h and ARDExternalSampleCapturer.m/.h files. And remember that you can always build it yourself following the instructions at https://webrtc.org/native-code/ios/

Answered 2018-07-10T11:14:00.240