
My goal is to present an animated character in a real environment using ARKit. The animated character is part of a video, shown in the following snapshot of the video:

[Video snapshot]

Displaying the video itself works with no problem, using this code:

func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
    guard let urlString = Bundle.main.path(forResource: "resourceName", ofType: "mp4") else { return nil }

    let url = URL(fileURLWithPath: urlString)
    let asset = AVAsset(url: url)
    let item = AVPlayerItem(asset: asset)
    let player = AVPlayer(playerItem: item)

    let videoNode = SKVideoNode(avPlayer: player)
    videoNode.size = CGSize(width: 200.0, height: 150.0)
    videoNode.anchorPoint = CGPoint(x: 0.5, y: 0.0)

    return videoNode
}

The result of this code appears as expected in the following screenshot of the app:

[App screenshot #1]

But as you can see, the character's background is not very nice, so I need to make it disappear in order to create the illusion that the character is actually standing on the horizontal plane. I'm trying to achieve this by applying a chroma key effect to the video.

  • For those not familiar with chroma keying, it is the name of the "green screen effect" sometimes seen on TV, used to make a color transparent.

My approach to the chroma key effect is to create a custom filter based on a "CIColorCube" CIFilter, and then apply the filter to the video using an AVVideoComposition.

First, the code for creating the filter:

func RGBtoHSV(r : Float, g : Float, b : Float) -> (h : Float, s : Float, v : Float) {
    var h : CGFloat = 0
    var s : CGFloat = 0
    var v : CGFloat = 0
    let col = UIColor(red: CGFloat(r), green: CGFloat(g), blue: CGFloat(b), alpha: 1.0)
    col.getHue(&h, saturation: &s, brightness: &v, alpha: nil)
    return (Float(h), Float(s), Float(v))
}
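As an aside, the same conversion can be written without UIKit, which makes the cube logic easier to sanity-check off-device. The following is a hypothetical, Foundation-only sketch (the function name `rgbToHSV` is my own); hue is returned in the 0...1 range, matching what `UIColor.getHue(_:saturation:brightness:alpha:)` produces:

```swift
import Foundation

// UIKit-free RGB -> HSV conversion (standard formula).
// Hue is normalized to 0...1, like UIColor's getHue.
func rgbToHSV(r: Float, g: Float, b: Float) -> (h: Float, s: Float, v: Float) {
    let maxC = max(r, g, b)
    let minC = min(r, g, b)
    let delta = maxC - minC

    let v = maxC
    let s = maxC == 0 ? 0 : delta / maxC

    var h: Float = 0
    if delta != 0 {
        switch maxC {
        case r: h = (g - b) / delta            // between yellow and magenta
        case g: h = (b - r) / delta + 2        // between cyan and yellow
        default: h = (r - g) / delta + 4       // between magenta and cyan
        }
        h /= 6
        if h < 0 { h += 1 }
    }
    return (h, s, v)
}

// Pure green sits at hue 1/3 (120 degrees on the color wheel), fully saturated.
let green = rgbToHSV(r: 0, g: 1, b: 0)
print(green.h, green.s, green.v)
```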

func colorCubeFilterForChromaKey(hueAngle: Float) -> CIFilter {

    let hueRange: Float = 20 // degrees size pie shape that we want to replace
    let minHueAngle: Float = (hueAngle - hueRange/2.0) / 360
    let maxHueAngle: Float = (hueAngle + hueRange/2.0) / 360

    let size = 64
    var cubeData = [Float](repeating: 0, count: size * size * size * 4)
    var rgb: [Float] = [0, 0, 0]
    var hsv: (h : Float, s : Float, v : Float)
    var offset = 0

    for z in 0 ..< size {
        rgb[2] = Float(z) / Float(size) // blue value
        for y in 0 ..< size {
            rgb[1] = Float(y) / Float(size) // green value
            for x in 0 ..< size {

                rgb[0] = Float(x) / Float(size) // red value
                hsv = RGBtoHSV(r: rgb[0], g: rgb[1], b: rgb[2])
                // TODO: Check if hsv.s > 0.5 is really necessary
                let alpha: Float = (hsv.h > minHueAngle && hsv.h < maxHueAngle && hsv.s > 0.5) ? 0 : 1.0

                cubeData[offset] = rgb[0] * alpha
                cubeData[offset + 1] = rgb[1] * alpha
                cubeData[offset + 2] = rgb[2] * alpha
                cubeData[offset + 3] = alpha
                offset += 4
            }
        }
    }
    let b = cubeData.withUnsafeBufferPointer { Data(buffer: $0) }
    let data = b as NSData

    let colorCube = CIFilter(name: "CIColorCube", withInputParameters: [
        "inputCubeDimension": size,
        "inputCubeData": data
        ])
    return colorCube!
}

Then, the code that applies the filter to the video, by modifying the func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? function I wrote earlier:

func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
    guard let urlString = Bundle.main.path(forResource: "resourceName", ofType: "mp4") else { return nil }

    let url = URL(fileURLWithPath: urlString)
    let asset = AVAsset(url: url)

    let filter = colorCubeFilterForChromaKey(hueAngle: 38)
    let composition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
        let source = request.sourceImage
        filter.setValue(source, forKey: kCIInputImageKey)
        let output = filter.outputImage

        request.finish(with: output!, context: nil)
    })

    let item = AVPlayerItem(asset: asset)
    item.videoComposition = composition
    let player = AVPlayer(playerItem: item)

    let videoNode = SKVideoNode(avPlayer: player)
    videoNode.size = CGSize(width: 200.0, height: 150.0)
    videoNode.anchorPoint = CGPoint(x: 0.5, y: 0.0)

    return videoNode
}

This code should replace every pixel of each video frame with alpha = 0.0 if the pixel's color matches the hue range of the background. But instead of getting transparent pixels, those pixels come out black, as seen in this image:

[App screenshot #2]
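The black pixels are consistent with the premultiplication in the cube code above: for a keyed-out pixel the cube stores rgb * alpha with alpha = 0, so the entry is all zeros, and a renderer that ignores the alpha channel displays it as opaque black. A trivial sketch of that arithmetic:

```swift
import Foundation

// The cube stores premultiplied color: rgb * alpha, then alpha.
// For a pixel matched by the chroma key (alpha = 0) the stored
// entry is all zeros, which displays as opaque black if the
// renderer does not honor the alpha channel.
let rgb: [Float] = [0.1, 0.9, 0.2]   // a greenish pixel
let alpha: Float = 0                 // matched the key's hue range
let entry = rgb.map { $0 * alpha } + [alpha]
print(entry)  // [0.0, 0.0, 0.0, 0.0]
```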

Now, although this is not the desired effect, I wasn't surprised, since I know this is how iOS displays videos with an alpha channel. But here is the real problem - when displaying a regular video in an AVPlayer, there is the option to add an AVPlayerLayer to the view and set its pixelBufferAttributes, to let the player layer know we are using a transparency-capable pixel buffer, like so:

let playerLayer = AVPlayerLayer(player: player)
playerLayer.bounds = view.bounds
playerLayer.position = view.center
playerLayer.pixelBufferAttributes = [(kCVPixelBufferPixelFormatTypeKey as String): kCVPixelFormatType_32BGRA]
view.layer.addSublayer(playerLayer)

This code gives us a video with a transparent background (GOOD!) but a fixed size and position (NOT GOOD...), as you can see in this screenshot:

[App screenshot #3]

I want to achieve the same effect, but on an SKVideoNode rather than an AVPlayerLayer. However, I can't find any way to set pixelBufferAttributes on an SKVideoNode, and setting up a player layer doesn't achieve the desired effect with ARKit, since its position is fixed.

Is there any solution to my problem, or perhaps another technique that achieves the same desired effect?


2 Answers


The solution is quite simple! All that needs to be done is to add the video as a child of an SKEffectNode and apply the filter to the SKEffectNode instead of to the video itself (the AVVideoComposition is not needed). Here is the code I used:

func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
    // Create and configure a node for the anchor added to the view's session.
    let bialikVideoNode = videoNodeWith(resourceName: "Tsina_05", ofType: "mp4")
    bialikVideoNode.size = CGSize(width: kDizengofVideoWidth, height: kDizengofVideoHeight)
    bialikVideoNode.anchorPoint = CGPoint(x: 0.5, y: 0.0)

    // Make the video background transparent using an SKEffectNode, since chroma-key doesn't work on video
    let effectNode = SKEffectNode()
    effectNode.addChild(bialikVideoNode)
    effectNode.filter = colorCubeFilterForChromaKey(hueAngle: 120)

    return effectNode
}

And here is the result, as needed:

[Result screenshot]

answered 2018-05-20T21:03:32.597

Thanks! I had the same problem, plus mixing [AR/Scene/Sprite]Kit. But I would suggest using this algorithm instead; it gives a better result:

...
                let r: [Float] = removeChromaKeyColor(r: rgb[0], g: rgb[1], b: rgb[2])
                cubeData[offset] = r[0]
                cubeData[offset + 1] = r[1]
                cubeData[offset + 2] = r[2]
                cubeData[offset + 3] = r[3]
                offset += 4
...

func removeChromaKeyColor(r: Float, g: Float, b: Float) -> [Float] {
    let threshold: Float = 0.1
    let refColor: [Float] = [0, 1.0, 0, 1.0]    // chroma key color

    //http://www.shaderslab.com/demo-40---video-in-video-with-green-chromakey.html
    let val = ceil(saturate(g - r - threshold)) * ceil(saturate(g - b - threshold))
    var result = lerp(a: [r, g, b, 0.0], b: refColor, w: val)
    result[3] = fabs(1.0 - result[3])

    return result
}

func saturate(_ x: Float) -> Float {
    return max(0, min(1, x));
}

func ceil(_ v: Float) -> Float {
    return -floor(-v);
}

func lerp(a: [Float], b: [Float], w: Float) -> [Float] {
    return [a[0]+w*(b[0]-a[0]), a[1]+w*(b[1]-a[1]), a[2]+w*(b[2]-a[2]), a[3]+w*(b[3]-a[3])];
}
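To sanity-check this math off-device, the pieces above can be condensed into one self-contained snippet (helper names `saturate01` and `lerp4` are my own renamings to avoid shadowing the standard library): a pure-green pixel should come out with alpha 0, while a skin-like tone should pass through untouched with alpha 1.

```swift
import Foundation

// Clamp to 0...1 (the shader-style "saturate").
func saturate01(_ x: Float) -> Float { max(0, min(1, x)) }

// Component-wise linear interpolation of two RGBA vectors.
func lerp4(_ a: [Float], _ b: [Float], _ w: Float) -> [Float] {
    (0..<4).map { a[$0] + w * (b[$0] - a[$0]) }
}

// Same logic as the answer's removeChromaKeyColor: a pixel is keyed
// out when green dominates both red and blue by more than `threshold`.
func removeChromaKeyColor(r: Float, g: Float, b: Float) -> [Float] {
    let threshold: Float = 0.1
    let refColor: [Float] = [0, 1.0, 0, 1.0]    // chroma key color (green)
    let val = ceil(saturate01(g - r - threshold)) * ceil(saturate01(g - b - threshold))
    var result = lerp4([r, g, b, 0.0], refColor, val)
    result[3] = abs(1.0 - result[3])
    return result
}

print(removeChromaKeyColor(r: 0, g: 1, b: 0))       // keyed out: alpha 0
print(removeChromaKeyColor(r: 0.8, g: 0.6, b: 0.5)) // kept: alpha 1
```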
answered 2019-06-21T14:01:04.783