
I've written code that initializes one of 3 Reality Composer scenes when a button is pressed, depending on the day of the month.

That all works fine.

The Reality Composer scenes use image detection to place objects in the environment, but at the moment the objects disappear as soon as the image leaves the camera view.

I'd like to anchor the scene to a root node at the point where the image is first detected, so the user can look around the scene and the objects are kept even when the trigger image is no longer in the camera view.

I tried adding the renderer func below, but I get an error saying the view controller class has no member .planeNode

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let imageAnchor = anchor as? ARImageAnchor else { return }
    let referenceImage = imageAnchor.referenceImage

    // Create a plane to visualize the initial position of the detected image.
    let plane = SCNPlane(width: referenceImage.physicalSize.width,
                         height: referenceImage.physicalSize.height)
    plane.materials.first?.diffuse.contents = UIColor.blue.withAlphaComponent(0.20)
    self.planeNode = SCNNode(geometry: plane)

    self.planeNode?.opacity = 1

    /*
     `SCNPlane` is vertically oriented in its local coordinate space, but
     `ARImageAnchor` assumes the image is horizontal in its local space, so
     rotate the plane to match.
     */
    self.planeNode?.eulerAngles.x = -.pi / 2

    /*
     Image anchors are not tracked after initial detection, so create an
     animation that limits the duration for which the plane visualization appears.
     */

    // Add the plane visualization to the scene.
    if let planeNode = self.planeNode {
        node.addChildNode(planeNode)
    }

    if let imageName = referenceImage.name {
        plane.materials = [SCNMaterial()]
        plane.materials[0].diffuse.contents = UIImage(named: imageName)
    }
}

Here is my code:

import UIKit
import RealityKit
import ARKit
import SceneKit

class ViewController: UIViewController {

    @IBOutlet var move: ARView!
    @IBOutlet var arView: ARView!

    var ARBorealAnchor3: ARboreal.ArBoreal3!
    var ARBorealAnchor2: ARboreal.ArBoreal2!
    var ARBorealAnchor: ARboreal.ArBoreal!

    var Date1 = 1

    override func viewDidLoad() {
        super.viewDidLoad()

        func getSingle() {
            let date = Date()
            let calendar = Calendar.current
            let day = calendar.component(.day, from: date)
            Date1 = day
        }

        getSingle()

        ARBorealAnchor = try! ARboreal.loadArBoreal()
        ARBorealAnchor2 = try! ARboreal.loadArBoreal2()
        ARBorealAnchor3 = try! ARboreal.loadArBoreal3()

        if Date1 == 24 {
            arView.scene.anchors.append(ARBorealAnchor)
        }
        if Date1 == 25 {
            arView.scene.anchors.append(ARBorealAnchor2)
        }
        if Date1 == 26 {
            arView.scene.anchors.append(ARBorealAnchor3)
        }
    }
}

Any help would be greatly appreciated.

Cheers, Daniel Savage


1 Answer


What's happening is that when the image anchor leaves the field of view, the AnchorEntity becomes un-anchored, and RealityKit then stops rendering it and all of its descendants.

One way to work around this is to separate the image anchor from the content you want to render: add the image anchor manually in code, and when it is first detected, add your content to the scene under a separate world anchor. Whenever the image anchor's transform updates, update your world anchor to match.

That way you can use the image anchor to get the latest transform while it is visible, but the rendering of your content is not tied to it when it disappears. Something like the following (you'll need to create an AR Resource Group called ARTest and add an image named "test" to it for the anchor to work):

import ARKit
import SwiftUI
import RealityKit
import Combine

struct ContentView : View {
    var body: some View {
        return ARViewContainer().edgesIgnoringSafeArea(.all)
    }
}

let arDelegate = SessionDelegate()

struct ARViewContainer: UIViewRepresentable {

  func makeUIView(context: Context) -> ARView {

    let arView = ARView(frame: .zero)

    arDelegate.set(arView: arView)
    arView.session.delegate = arDelegate

    // Create an image anchor, add it to the scene. We won't add any
    // rendering content to the anchor, it will be used only for detection
    let imageAnchor = AnchorEntity(.image(group: "ARTest", name: "test"))
    arView.scene.anchors.append(imageAnchor)

    return arView
  }

  func updateUIView(_ uiView: ARView, context: Context) {}
}

final class SessionDelegate: NSObject, ARSessionDelegate {
  var arView: ARView!
  var rootAnchor: AnchorEntity?

  func set(arView: ARView) {
    self.arView = arView
  }

  func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {

    // If we already added the content to render, ignore
    if rootAnchor != nil {
       return
    }

    // Make sure we are adding to an image anchor. Assuming only
    // one image anchor in the scene for brevity.
    guard anchors[0] is ARImageAnchor else {
      return
    }

    // Create the entity to render, could load from your experience file here
    // this will render at the center of the matched image
    rootAnchor = AnchorEntity(world: [0,0,0])
    let ball = ModelEntity(
      mesh: MeshResource.generateBox(size: 0.01),
      materials: [SimpleMaterial(color: .red, isMetallic: false)]
    )
    rootAnchor!.addChild(ball)

    // Just add another model to show how it remains in the scene even
    // when the tracking image is out of view.
    let ball2 = ModelEntity(
      mesh: MeshResource.generateBox(size: 0.10),
      materials: [SimpleMaterial(color: .orange, isMetallic: false)]
    )
    ball.addChild(ball2)
    ball2.position = [0, 0, 1]

    arView.scene.addAnchor(rootAnchor!)
  }

  func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    guard let rootAnchor = rootAnchor else {
      return
    }

    // Code is assuming you only have one image anchor for brevity
    guard let imageAnchor = anchors[0] as? ARImageAnchor else {
      return
    }

    if !imageAnchor.isTracked {
      return
    }

    // Update our fixed anchor to image transform
    rootAnchor.transform = Transform(matrix: imageAnchor.transform)
  }

}

#if DEBUG
struct ContentView_Previews : PreviewProvider {
  static var previews: some View {
    ContentView()
  }
}
#endif

Note: the ARImageAnchor's transform seems to update frequently as you move around, because ARKit keeps trying to compute an accurate image plane (e.g. the content may appear to be in the right place, but with an inaccurate z value). Make sure the physical size of your image is accurate in the AR Resource Group so the image gets better tracking.
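
If that frequent re-estimation causes visible jitter, one rough sketch (an assumption layered on top of the code above, not part of the original answer) is to low-pass filter the update inside SessionDelegate's session(_:didUpdate:) instead of snapping straight to the new transform. The 0.2 blend factor below is purely illustrative:

// Sketch: drop-in replacement for session(_:didUpdate:) in SessionDelegate above.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    guard let rootAnchor = rootAnchor,
          let imageAnchor = anchors[0] as? ARImageAnchor,
          imageAnchor.isTracked else { return }

    let target = Transform(matrix: imageAnchor.transform)

    // Blend only the translation toward the latest estimate; rotation snapping
    // tends to be less noticeable. 0.2 is an arbitrary smoothing factor.
    let current = rootAnchor.transform.translation
    var smoothed = target
    smoothed.translation = current + (target.translation - current) * 0.2

    rootAnchor.transform = smoothed
}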

Answered 2019-10-23T19:36:44.093